[Binary archive content — not recoverable as text.]

This file is a tar archive of a Zuul CI job's output directory. Its members are:

- var/home/core/zuul-output/ (directory)
- var/home/core/zuul-output/logs/ (directory)
- var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)

The remainder of the file is the gzip-compressed payload of kubelet.log.gz; the compressed bytes carry no readable text. To inspect the log, extract the archive and decompress the member (e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`).
=@OpyrUȸPdF'?|Qz#?z:*|*raWӢ\zw{f#O@膲rR!-exNE.ɑfV^.pqHƃTgz9`u錳.?,&Se3H )YE2ؕ.%)uѸxhƈ-Ű *Jh#$^R$ Da쀁H +42 ;`qg '9|~BټUȁ'/2 ;|KCf4MPpGG 7LtJxE)UxX:ңn6D݀F/3FCP&P*ˏ|0CGBA4Y6俖_ͦosf"3Gt~|{J M!} yX@T"Lz=%Y?c_>?DQNG4{hkOE{VqyF&=[,}r1G]uo׽d'eL|ôq'f=7 {>30Kf0C֛O@܇htl-;\,K,wQ0J:q\\v LrPpz܁6yG=Еc| os y p'[_PCyIwv~ zBW߹Iy*J[qV yS78:~rSF^-~K7 ˆUSMY+KP6VUϧќh4- jN)1~:9M<$Jz` CWLYpQl#Ձ:F3ǰ2 UlQ)ՔPf쨯_|2<̨1Fr-u1[~[-\iـǡC!7?#-rɷo]YL[H(z޶ku!b`IaNѼF1X q ,cYfD..Q|]b~UNbwe%઎9:( ^d4bV6-UO%2L*QQV<"DﰊmɺߛFsy@`Rd4'!i;j3BV{d2` Q#(D`N_"K`ۯnϵN͐k-Nx<]Ϧ^38:R4Rf.Q6U0۞ - rVV+#Oy7> LOS8GSX9'w^ 5Sp4Dsc7(7M JPlBjzZ8N"Ca zEņWnG ?BM\1p)e!WUbM'[ :p|88{he["VpTgJC܁՘+VƇGx5qQ5b~7-{.RO|ގ͢f9뺱{T)XQ©BpGk.v[Q_)]Udu-9HakN(l͙i,1.y$ƀiRTP#cǒ?{g6Ίb``tRv$uD3;VDݔ4qk}>X>ul -zOd=+-mNԅ֣4c({} \*Mn}5m[=LCPx %is]ސeg.J|wXج<򟃯6`4?آj-؍>(`,r[Җxw`>Xd+c1.hGjL:Miż\"RKe5SݓWȋz=l|X2{@Oe`mOrxZ]hޖr;|wE?.ME#+eM"OfuN ڎF qcNؿsg T|5.ߗTz (ݔ4K}Gs+c4]op.)1QPͅ?gDCGl#}2tƚyw}:ȸRӝ2}Neu k %]/Eb,+#gj½8+F-]vid)qg)lnx8,ep 4Ȉ1D Gҍ6m>cq &FKbXȈӜ(|Jt^^Q]~R oO_%ST8[S[t)[ In6?$ C ͇+ӃP0w'ygSQ⹌[۽.3d4[8wBLF܇ѩ|uT<ڐ:!UjdāY!H3vs !r$60πREa>1Kˉ81q3mFz*3џv !@ȇ@p\R^ XFԣKK؏&`e]<$߅T1Oͯl$8ӓu@HQ#%u0%72ЀۉF̻ .kF!umg#?+<K4z@#rxqZ~Ҽb 3J H[`"[7dd%_|-'kdDi"N.faw+<]F.i.VT>!uE5]O< 杨jdLfLY?l)!] t4atĞl8]qƑϘ셽_NJ Jb7jdUބ` "9]#UQ btLeDMpJ첋uJ)n'qOB57@5`a^te(z揙@E;@f'S,HFg]}.V盪 _¢ RCQq עiR48Q)皸L %[uvwP̕.ә$sU N4) $M0wg50mj.g5u6g$Zs!/"J4D(F7~Hqp8WWhzT-h5S9_RdLj2~FF0zʱ$ D'c (YFmEFTT+[iЧNֈ첀<9Gx1N'>ҠSzu_{%in;yic\nƤ:V0okcX?_Ebvmf6j1-ed[\C 1_l4 ~`tY1|]0A .<yA>4$ -OʹP\ D@Neՙ* T2H|c? 
a`r[##&QzAN}ӆlesqɹw}-S r` P HuhbI e\6+H$P#F2UTi'I5Q.(`ڛTvr,f3HdUxM[ #~M̱'ޑ?5'Sus:ݺ (`6ڟi}!u)H }u;TaB m-гl/,}Hb2gtZdLpd + 0fH.xB$Vn\`':$em )( !R;!m؛diɀ.C;4vngUBߙ{̢K$BCN~WamleBYΔ(G Lfbv즷Ano.@fwQ9 wTil[ W`}MV]@~r#dlZeBHA0gi-ɕ92kjF͝'YA*{\W*)xƏF\div]d7Ӎp;T~]ՊH;+zn~dA=YQމv֫%}1{9Ȉ6ꊇyoȋ8P6ϐy_TIJiΌ뜨 |xa =9޵Ld c笌pY[4%a0SkNuBm4:C# ëGХQK MO@ ]ӳ$iΊ.52Uncb/9:gb)f9 jd|%og͘.2?2+.\d@LeJ |[܈FĈtdg7 Ɖ%QqF30VuJT1*Ke ϴRBH 3;)GFF zգ3n8ӗt^ry`4ɥ4,u<rZjb52"2E {x-t]QDF1HwU\7Nu+D (iq9XW6SiǍ;&DW~3K^52"nnUB>7ecWbK=xuCыZܥm( )wIPY=y81ȸ:.")U(ͷSx3 K+su‚&;jP:k{sbP\_]L<'RFT#*:`@A?al9Dq(RgO-יܓ@e_N4* =.nh1M <0xpԗx=l$bP0n pG ZD4(| VN؎3ԑu}NGʅFƕP)$R< ܷF5~mK@[Zxڂ #' mcnɷtuq0܋gv)2gp`aSi6_=YGKrȈNFD[C9sVc> O<{:}{T<y:Ejd#wg˳ϻf?Jʆ#+Mվ]Ȃ^VOt:*'2fj\( a9r Z0q{Db35tlen<P< qegG WJ~ ޝ]=&{ϻo 5j9Q##F(:yex!N(Yy֘^HYwmVG]Ke+S\Dq"G0RrO-/?"҇{6 (i2+3)K72^fPq1(&Mf]IѶ{.LOl_"Jؕiny͜l[V3ﯿZ˿7Xy<]^mi|qo3|90 W3g8`9}1拍'h~`( !?g%7ߙ c ,_ʿBf75!`>,f 0lX9!5$ei+bTw$vfD^0 4:T 1ǪiwZ]꒮kUq͓ƙOn:Ei4!&dHzן&gg}8Cr2 #mLW88P* J`ۥ1nןN02!XۢhGGHZdkmJaLx6 Pwӓ:P۶kozD뚶 Z=סI(CBYݺqQ{Em;r!!oo0Vj]6U'nW]E5"g2K+NC!{I0&  P<mQ:pA]W˿F _۫{g`)jlwϢU>TUϱ2jA G_#_/]!ə-eɲ~ߒ2{TR}k@5}#cê=}ǡ1:2B;u)ּ{.X㬦o3aӫǣ6 &BI&6h:lr *}{+y}HP>^Z5gku ƺ@ Qu"G CF$XT;YiF>R4B1y-)Io>"1)^jx wR`9)睦hD͎Gf_i8ӘEOlX:Nhxg wg k'ŭ ~Y9Zjh^^M?NcF? 
>NRxV{ϣA2cCZw\F Mǣ_V<YKws&hv|@P*F+ui` FT9č&>*KI(PPwh(8#">.!-*F R*FB} r@x.iZ$t,5 g)Њ1{K(UcS*, PaJ?@=\#hhY,S!?p;9góiyr"PE++ 1QeW3xʐ7IὴE$(qY#X=f4tF|ܜM?bHP%;оW Xzn&!}>RQc`x TRAI'Qd`FeL>CEY3~G8A&P(T+oQiBiK#4$IFwE#J"(=d2"Y6FZYVPCiPE7hV)v~2fot:롗:BmطM2 ID\z@mPt`*"% 7-k_[ @);N9:ެ};"nfj*K@׬Y4 9r: =̋tur_ O~to!)ef`뾑uGl}XmZ42hҗ4$٤ Wb}[tTgq!uͭ ~^X&>.P=)ݥK~G< #)~ů ͹h8[4]O݄Lkw͉_ KϟyX8_oA,_:q8BȤ Gxq}-M#c$Zȁie$eT`x9U_I K| |}ܸ ,*_ggʫ핮87dHa?鵥s1V-kj,w7'(32[ -tj)i)lӠq/#顄Yvld޾ŋ|Rʴ=Jz8A[vF'fWr8n/W#2qطyp}g_UGi4޻::֢w}eBER.2Pw}leoBi)'^x?*ݢpcPB>{,Tv*PTb5(BF ǪBDPWYuo, `تRl{%n>JȮhl?[uP9њRÏPuЌq}m+Ba#[#= q{#4@k%tl7 |Nnj6rS5eSb;)^C\@et4i$RtRh-=ttmFpBx#ч^yX h+ޗwxc),S)aQa>,Yy#2m26łoR!0wX6nߋe5yh&.*edS -Ӵ`RI-9@L:MΛB.'_ŽW|E|m<΂jc&?j+7%L9>(uK1TXgvH(`{$q{:MMkw6F~4:$uhBJ`ry4Pݗ Ҥ_Y*"4}}9b>x*1ia3P4K&}|/nnna}rc.)1)aFN h|N2;I,5 ޾$9ZF]xّQ&>)9S}[MΫ,B,)u~o̠bFQYcii',7azf+{&k7a{2aiX5 ӛ09v[4mK'w՗aÀ48tjԸˢL¬ɭpO xc)2سLWiedK2]md*/Zy ɴ:ξ[%̮͎kuOO#бO .[bDLPɋc4*Ќ.hڋjΆ[1g7n,+PE인ae+ApPA {7η7͵p!Eeh"rca)dKƕ!j4^2j{m68~p0~xQ;E 83DfON ';+·r1rqB49C֠v쨱.ǻwyXqʠEXխlSնukAbO+Vsm0Iӱ#HE6HRb@$O-ѧgY{8+U$k\|ȇiɢ),AUS#q$ҐÑh$TwUuu+ܛĖRQ%'YovM}`6]C Y{T6è](IA79zӅbWD<%J)XgSZnQ TOc#=y?wou Wц|E?n3rƻέ_3"JtOVtXU1C6aeObφs]&퇛"xG@{_79bEɕ@0Fʾ{` ݤV}o/pZ3F)a5c_ᒮG-ʴLf^@*'ʑD3Ɓq 0n>0h4h/ >i|8B@@–㈰G мBulXI).^Hؒi3AJ?Yԫ5J̲rjs`EZW6Z"S}f@2֭Q-Hõ]ׂoYm_ 0xжz%*eY&`lQ-hdTdOnCVтRy~T܃Kq4ZPBx얈N[M*LTIŽqx3T{dWi`OekD5LZwt]I><0BYŁTȮ%N[yR9o7<9^ "9CjCiQzj L-ieB$Ɔz[&`xd6 3v\Qk{_ }K#eO7Jer:~PC FXdZb#$(@m'PS8|uPv:k?~,8l68MN,GA 4Rj4_GiI"էK?VԆWq -O? 6˟CԛPMGÍxOLّTHR`xH'12[<hR9Hƣ5'>|~42mu!)r<*^jzItMWm# Tԋ@aNb/PB;)a@E| k%Ji[&\GELE+Bj;,Vzc .ZJIJTֈ)q< rJ( (ӛo{,x9c4Iq'y4ZujPP:hl-v ]|8-ڡvZĒʀJ7rH"g Mժ FdL١IuX7u<\yq ܃U78'#bT2$Z5 ul{袦a"ΌMm[ (V$zB1,nۮe׽ܧt-ěQ\wٚiJTFI"lTDƄ4)ZV 2_2#@%_MQڱzػD]~Wۄi$S<>Q1^,1E5u W _qOk.W1끣x( n lAh50[prB+dĉ 8eʫ mLeJ,RsYϤog{Zn݂J ̷(sb&$]HW،xuЫ+ȍӭq0JM`x$}[L<:m%)z<-wj*b/3 u+2R|s)Ɨ^[by)H]-`^dz,e@gƋ~(iۨL_S0 (9~J0s_arNsyK1ce-u&&}3TOّ^gL?2f6ѹý.N)<i$|Ͻn6ZlC(!xz}J>ϓťfxY~یtD! 
Yݢd Ys 8:,dT*όNࣿsf J-㭺l;T'$"$ѺO"ժ~&WZPj(~'ID[=adOnL Mw6+^jBՃ?GcrqSm 7ʎ8}  j_nm>]ZrQHlAq2Cpvk+fL?MU3*uIUa?J,.ŮH8)tWҳk H݉7Qk`zA]!c5ާ1 2xPksi!rrI%ED#ﷲ_ytºXָ ZTW/g2K8H hxzkP@1g5y[ K~OFkyi aH7ۧ8'A0?5e 2xFOʑ2iֳ[PKvzs[mhT1)s$6 ,bEXQ k=|x}AHF^0@@e@Y,ђ{$M)>QɢGJ١ m[ᅵT'%d gAݯr >'Ooك(P=Rd6L\\ѳIa"c:\#Ri'?߾}S=Q}GE^ "NyQa%nZjyiq(Z\]Dqy&7~,o/^xw;/ $gU:{vY=>xV.o~Nv1]P=уs2V >4^>G_Vq-陁iY/,E4ANFlRLQ a LĢo)0"O$> K\zKL~^'BAb{t:{[+VPz,g-zAm[JjqXbԊVѳɘjLb[ir*{r OSz׋+a$")&2lr 5#rC;H: g:ğ8vԵ؂>ËG;`"i$J`%= L!2z նwTNĺ9uBD'XUхc(~IF2hx`ꝧNuxtJn-HM7Yh|b$@P h2TZƒaA%lVw;;{XJ21d Zs .4lqXaD2B6G>:8j+2z JBYՙ4S.>!5qUS1C¹Z˾֖:Ʀr~N 6"FH`b0a0>a`ZAʳk$Q?T3c= \=ߘJZW~-!Ƀ ug$:']mP86:G̵a"tjV8[-U;U-ިa 4G[޴rpl}|KZ+PO9LbRcejE)%U% 3koMjBP+oUJ+sJ/jƖ՗M,GpoC2,]~y%߽.ytPtukSKk.Jӱ2߮?8JCJez;~XYjogg GY܋]wIJ]8%7 9=iț|@_3Ieޤgok΁ \U$ݲ=+#E*vvNzI4z%7RR(H#{yx=%W8ٓKV {;gZP=U,v ##ßX)>Ï˸>*#c1Fb]|u-1N췸w3LGNu8̠9@YIEz4tE9DnL1dJfC+αaTuM Gh(d=L=4ʥG4"u$'>;wmPi)+Y̘ mT}f(:ldlBhI-Э"$Z,NUQ^j.%P7l-2ڪDQf܄ԓSh[cj' mRs8+WDE(>[Q]&_LQi[)$L-68H؂sa$kT1ٷ¢"s!T=gr8xĨ=GRa[G>J_gK_cun?e@ ع!%-B %W3D7 +lD,7u) rU5V ?Ͽ˼W>nwe 3P1Œ9fsŴ4&7̑ >Y:cu~ʠD"AYcQVPJ[V ˵!: JqZ(%Wj E(JGa`* ci*A9F2(f-/*LR+W)]Sc;w'֋R(L꽆½rWPySM#jjtcfqܛ?X "'JPTvCqW&ÿfDR|?V[*ƨiEn.#\H RH4ǹFm3j:vw>.cm[2\R/XIhbn (SZ m;dV x1[j֤ﬧ;oKcB.4dbUG8qnMƋso4 /νōsڗTF«s˘-(j> :YK^#xP`8f]˟>F){&Ox{faM *z&h"{AVO'eQ[2|;;HXj捲L[3|9w@s=X]7ܦpedO?QFl`BfT1Ku+ߨ UQ-L޿@}d_~&3\nx2ޯ @r_ޯ[2^S$F|~9݈,]\'VwY鳉 [Jw_d5(R 8b-wtŇ)1DI-"~(ޯ7{sypj ɮʻȲMϼ٫*2(SuuQPrje6{j15oG:WU5VJiF?oT#HX|wT۾87F7",͕ç0.bn0*1qЄhaIPj %?h "}H.RJϊvBQBƸei!Iu<}KѨY) =qn_<,БX:Ȏ [w4`&EtLvX.& o5c97NQcν*d*UB"'<?~&,w]iFf~)[-9 Јbc Xsr!R?ruz>S(G'Y$O8yCIi;4.61GWvi)OC(RcuPd \]T)x#g4Z*-ʣabv{,Eqxe' Ey4;f_Uk_24WE-%3%Js.`X|$:( oҿxWe6r+~CG;hk ZZw43a&2) %[jos=BxXJh^DD6Abp@\:j)5(,ugF;3JݙQά۫pSDN@1<þ֔J5AUg|\lQ ;%w]"VmO9j~eL :L#:ø(OtJLQ oܖ g=Gb|ɵ&AA`K.e[V0zUAȱt%XP+gUA'`PPkJBv%}&wu7w?~zsr_q/yS<Ar誌;t51U&{Eٻ|EExw0Xp4Bׂ쪙*#էl82g!z[sDֽ(86"wN(c, t}yʾJ{f=J]mW”`:h˶T$vSC<=ݦ]-ҾdDЃ%dJC0E+cn^JS_W pU.TiF_'_Ǜ ӂ4}ǓNaC?ҠjiH) DTL&l+GSeE+ ,SQkO>Pk*vrAv y=y'E,^AtR:(O.뼶3YFֻHT8\Mϣ#'wHaSE:!BxwŚMuE#b 
{b@jm{|:o3$9O1( v!9k ĔG:hds9ÝLPE]9|Qn*-`h@ wi|HDg.WR0Kj-T]OHbCE zqp٧S`"L  f  -5xE-;px#K VI:D̰- Fv|n49N:钌im]36hU/-W4@ߩ{>R`y@c#g= d`W+>Rd0zGq&2|9G,ԫP0>VR @hZ쭽r#; 9߇?G28\AG^l'r1ptd͡\ƪ`޾ n(2< js&Lj%i^*B/H}/1NiTYNCzH6+S/UT&Xy( 5Ƅt+t[[ =j٧i,O+'(Kn\'7RwgBA4n"6CbC7.W#ziDFFF9h1-$Q>cf}l:Ebf6 .TnmDkD3nfm3o33GcmLm[2B%T!q0g_H52(-Q0ΖA'`h|'U;ԘS)nc]TV,RQ+zYD*v|]c  q0 "):(OECb_Ztdý{zШoLmq!c;UY%]G]p:`W~^m؛cdh˰AjXtRNH *Z۫N FE{@P2qkb3:ݵbPKUuŤ$COcԮ15}Bx'X7gr!YCD & uεeL;3%(So8SR.,V-g/u*#/rzLѓ@Ww08Je}31*'q8,UK#ص2ao-vNNݧ~΃W1'%/ڢ2/DmO)NRC5Rk׼XޯH ;5>{fQ%YA4:Ghh`rr}*I]p# 'J㓗eV]QrThL5Pj#!v9zוbIzPY8#tG>~/\2XM ; ',)^]*UUWhnfS7Qu[M(Nl c# R.G|5κ8 A|vP\C3EQMT'g*ꊺRhcQ t@JkUO[ًI&G0vOc"l){d; 0[56ugj`Iq'W]eU%ũf;UQ*: E @$&.X; 7# cAQ*n] 𡌅[1n^ Ş 3 sz52x 9㶎2A>FS jbH ¾V 9kUU.Ԡ~uVnu\;[ŻO%݃8b_u?HFժ;JN3tӬPH[%|#xs6eJY}+kz2ϰu),Ϭ}ߩpW;ʷ]4;Y4h4%׭5&s ;Ry_<d:.CE37A5ZK(tdMxǁqȱló_Y7HGHv8(%ղ{'Dzfv@YҪT'>L3ifh's^Aߓtrt-z%YbprtQ8ZjPוqW7!wSV3sft]yVA;nw 97g7v;j.}4/=atvc[nZ~ֆϿ__es-G,ϓ˷E,׷>?;e/=?o|[^#˓tzEe9~+7əZ"]}2KON/>ua92f6cĴ/0T"2ȁ1k7 '_./5~Xk5XB>#菺Ć'ys ru;ݗ._Yj/NZS_~Zy 3ѷpuKqy6b=h.5i_7o5Ϟl=0=daz!IIRؙĿd|i UӧK!G2g_P /j.%ۺnFMk`XH)%9%lCri*iAБNQf|X}D؛c|^A+6fn6֏\4ent&}66c7ʳls8L`>rvyWBTb Q QP}֣5[\ l@|Lv Q@_XEC\\V+&ST[ duQc>RS}v$)16ibmSCO<-b͵$W ,X$4 LɡucYD!<J~%b2RK^Ib&:5 @L1 =) 8K_K(M\Ԉ*dLƃv(^sSc+QQFNVTEb xh|݆fٙfGߧKJ/]#8(6VEx)(*ʨXsr5Z`,dSK\hF]q b1;9"#M"l_|zΘb&iD*Mlf$l,%{P&)PH؉t)"zN}_5ZB%('*|*Bsnh)8W ZHC!&Fmxj/BY4rrӂ}wn~O?9P-e|k&Qbe*ab,ڔ ֶ{W{bgMtk흪Z%ڇ^sZ'ֲMX /$J*m}bxbY UeyUx)cbV=TU(JhLZ( :^ i=2HGRK/'uWD7uv lF#VbmUYBe#dd}f"Rh^As!m6T9aÁg=@hD&o4) Z -D?D0lw'֢/ |\\ҢQϹxWM\=TPD(($V'm0v 8^M'n,Ϡ5>qzG#j1M$=%Z""ԲO}z3BKwfF+P7u+ס߼GQv*U7g n-̈́B4:אabB#Y5>&+1{8>6\.z:DAGMPwy\{Qñg~Q=Fl`x Ybl0HP -ݫ>8|$-wI,Vr+Р ^yʵXݐq(V8 Mae1L[ oPȗoz- KT1FXc"vȣn9C=ͥ&,!@D\Ap,q%%~ _s&@=[ӳ C5 :5h>o ,;<]2@a򀉸?Ԑԓ0=GM 4K&Zu&=~d h.@-L&=>UZpe*O,<->^L::;*j RrC`tCW&yziPC27BykVnRƹjWgjޮx1V3h fsr 9]y`5A-Mv^r'QS.L롏yZd۹ܮx9F؂v3ڼ#3qN(l O!`{o>kη+^t!~Cnm0(ZTag%z~;wf++43 5הKxÒc'?1is3GbP_*+^Ae'~j^k0 >ЮG;VA}SoM"Cׯ#Fyx֍=Ga>s\Sp(+hs8!s}{>)  fQ@^8 %9vG#A U,}*qdm$Jh튗#O}oi0@7<F?7:nFzzx]A\%NB 
׏z=Ѯԛsa_6e}۔oUW&ho0v+Xŭ I]}Zmwsrcݛ.ă/?/W,珓Yû~P3~MȜۖN$$^POx?Ku)}WS&}fQ|)r=qɾmE ]#`X1*VM 7-uO,qL#O^}b_=oX]Bз :r͸5׌V8h~!wA_'vs|YgvbHlԁrT^\]>ƫ˚X=xo Zb,N.ʢjYHKT|\b-t LYykp[{|}Aleq]b4Ƃqʺ: KnW }G#wȍ ++Ɏdaj)6}͡zA;v=%y_ϮUd?.g*$W:s5ZlXAM$ا63=.4m77+^Erۜ(:lo8x$'8b_Y3ȃ})6>YrK]i]gDag麥l?AcKKZa.Z;={#ACPsjr5cd*f1vCYϦRqD6+F`C4pVj0_Rb|µ~*In/| .J"hE׉,^Ƴ>i&dz7=`L^ozľ7O{)Nsy&N8#xvri86z 9~89[[MY5AS6w·=`'0L3[r'/Jn)`f`7- av?:6*/@ *uof`7횏 ;O8ZUzZ nvF!g`c;}JkvFvl۠v~v3ۮզ{uܴN.u'gEIk1\|*U{YQ#vɔJjOdNQvŹ,50`k)>J IC,y^v#cډ_tCv۝9 œ;Y+QuA|BmɦY(D]yNU<=Gׅ\p OYyUb^vn:15fOnwI'N2X5@hI~1b&W?l-5]Od8uF`+ Gq8{0X <_E<a9sMaPz]&{#O=(aO0'uEw#Eg#j_c9c~@Mcq^7^kt hvkt_ˑPZX%ZD߃=;GsYweFzIٖy~8If3, 6mkGs)dZ1Pd_Q!ݔ(Yip{vX>&`Cj)TWE$@,ny@";r`lTg0y1f3="{"@v%+*a ZiJ(F5;D`wjQQ2#v!KO g^mL;Rx؝(L!;@*+;" Xa8E`¼I`cvFcNh''ɣ#\PYE`S\S?y5W\pklшiao.斚e'-[Q῟qN>|ڲl<>#͈/aݙi(*k)nL %3K UR)-Se8H2.i/TL3Q,Q3% 1Ycm|=A6ӊ|IMDBE1yrm|7'W+E=0Xy"uP9 uJ!("9"3o-&Fxd f^G`W _TiKv爉(0FSFqLTp-pTZbҼINcvqr4_XLT0@8g@@(ݢHI. &TSvyi5b[{Z_~dO<.My7Iq~2lաtD۾O&Z[!Ȉ*R ~h0AP2EDCgwi]b4m dgU!:1\72@b[ZۦnVQ7jhel@sVE&`%%u/at4-f?3NO<$K⻣'IʵJ4TI5K$)=35%X+#}[g(n-YfL P,|=y7.IQ֟X*OFXyܴx+QhW.k>K0"50O59*N%aĕνtnUqg#P"܂=ѯ#e{M~S7߾!.+|"2HxTXngx~QLά1k&偤v-htǴзO2&0F6wQ qA˜Щ9k'ȅ5$W !ao׽S֥xLh׬|t5k^)rW+:/γ {0hճX~-yǷ. ZQ]E,ℜTȲ~z+* 11u..;BR.9u/CKߋvZ#UU+6&h& ƄƼ -zBc_ lTpRJ<? RBS_xEr6 ql (<4|S\i GʼnKD$&Y1SmiΓ5x2kgª!W+bQģaZ#HbFE,zӒPn``OzYe =^AyUU]lS()'gr0,Ǟ~D6j-Bйd8k'Z.+Z5,C0STKS1\Ijh8o?l)mޘ;fi]J`p>s0DI]R/ %1 5J)e)0f$Q*U)ϷiKQ BI9}ۯoB yEœFDSI:M-%kXQ! 
^(5޳UQC".Jd`4݋լ<"f{lTr|,H<3ZPBtO3Da\:KgkfGB Wr7G>]:mR缲*qLD`x1p Oun GdiY3D0t`ya(g|N-{ Ra)&+mo@DEFj)omfvT֟y~SS8Tk^oU(waq$зl z[* Boroܨ;HP-3Pb>!5(393^\KZgR9z+G5ba6 ό`+ဦVGM ")ʊaIB^Z_4B<nF>S)V7 %Tuq򀑃qBAX `$:&d 9WĤSL:e\^ )\H}!3c M&dl)Xi 2ޚ"jX-bMdbDU8\MF6Q,s{G~֝T'0M3Oч6h'fD:))݃./F'Kvvqgﻷh/ZV}"Gw4̓o6~JEyUgǓ?uM{s40c+)`E36RPў-ÿDSbe5Z9tV ^ + +-%z3 Ch'$p|g,( !df?0SaAɏ"P(LΠ>NGY N'?v~>}yB%mZ lBuJDh4yD%d+ 7iGѢ=gCwDڷ$RU} WNiycM<#P.i: 3&vqBc丢SoTHyD0.ZeT dZJj!y(\CQ 9 DVC5qZa񌂐.(4uPc85!/-᮳Ur:\*W*m 9ϙ˵rLT2d.)hGgnz6}{#al'm10;  q!8c!z%pNl哌(m">04b%+GYh$LȊpX{4oXIF9U(Θ&E`)M؅Cp*mC5M#[m$#v#jk#L@.G, '-|7| oʝkϓ;m+8oqRdo'?h|3/{K?oI*<7 !+o Bb[.}-K;vGg?%J(J<F6M)5Lh6?L!sOs':y5L*}?RdACg; ʹtV/lՓډ.e8^' ]m/tޢ!:*P wj[|vF'D5cZ ;'KV6Y+8hM=Ȼa"N/6Y5s*Dlߚ=ǮxM-tי.AINUu"\ ;SRL%#qE.e Q2-tiRa[xB2ħ kzFQ҄>)LOaD?pORE)KAdᮥxZ'xZ'& -(P [k".sD8pƂ?W9Psw~ @ri'EIE$.toJFb% +ش |#2rQچT$*#|!Bizg_u#f(C9I L 171eQ.I};zyO5S%-EЧ,Ic7PA8'qH5x.ɖxPYT) Hp5^:zO2BTj ev![nڲ%cZvOQcj6#!QҰAt6̒r@j6,cyDԈqFR9 4B*K_%mq!I8Ia6u(( (92ȑO!#&M)řH|f֧aQj)mlLFb% +t3^2hv!< LpB$)Ր6}ZF4r4 [+z#:p*)!D^QY؍6y{TvOiccj ?JVd<8a/Rkdn%TTAPphRR ـrKXp$lTsB>/,^[~ՏW?}7omRdǨjwaKww:F(m_K05G6$9l6ͅRZ# u! 
p&h0t = X@Zd*Quo^1_No) 򫺙SF3yˍyuˆXMۙ?Hd>LqROI.)W;غI\)Xvbv[tbqOwb;o(b'h't YkLA1]YYC.05'>}weq\Q+˯M3%؇ݕ]fcFwNŮ\|q.]lFٌHi(5 KkSM\@Qfl֚*> َM rG][o#+Ƽ.mNHf{da`:-; %j,[ıvU*_m;Ƥc;֬l80zStgu iͶUrmMs'c O -,h-$:plsM WԖ4e d@;Դ.fMM`F O+V9^"C+`165tJ޾=z)}T=͗UGtm lAjräXÐCE3xVzD%P}UI8;}?;eި.^u eZU) Oi7(fF9=&Ä:D"P_DF\YI 95vi9%VvLǰp 0nbTj*em%zyjC/,!֩c eh6C`j⺟@@QoAS՞ (R}8kH!1 F6B+yljk3FOVD@h}WFyD@*P;v^ \ݝ>M٧ne??\?>\OϠ@u)T*e1(y`i%{#k\Jq_UNq_U=]/5u<*r9dv'E`kc4p-8MCzۆZ\oΓ0M4tNhaX|I "HQu[ :AsSaibsApMTP ;LWӨLRhe#T$`V^(ˆinHGTґ)cP1X;y.niW`Fq[b,- mm~3ݵm㹣\Jjz:tD/pPRHSވbʙ׼2t䰻J„.-y HvUοWy;_w7=HB#\^&s@M H͉b4q!x hIlP+wmK)K/`s iXAj#2:&-]M N餫N餫z:nT ^ a^qM"`ׂ+5VRJ",E,6*mjeT6d晷LİJII>T";O`MG漙w_ |mWz zj/vxw%4_1o<'ؐ޳D+CN!xuF?g,r9_~-n> $1Z`qRed4SZ֒@D4Ïu1[XJP.@3Gp NRݯwO=Reȵ|4Qj6QO.ҋY'_?1*;P&wѵ(B-.SeKaÛ.6bços9=!ׅ߈nT:F}#Bŧb/6}S$#@M,h6ݤTrHbJ>@pGs7SʇG5l!mN 1DGAũ>kvVU Iy no[Hz_s]ziDMIњT9ǍqkxKq5.icg <[FEl>}zHr $El;}mL!N8`?އI>L+M;a(ZM*^8K& y\N#! ZWy`eרieEɛ7SVdTԞXT=z)ʼn$'t]lHvNbqkB}mI+,ݛyކQA$_ݟyOUSùt>=EH~%˵(\`@%2}k!y{n?\,:)>ąd9y.Vk#KUsO_zA⵲s mѕHT`dIS I*rJj mzCy0ק_. k>'Q? X #J1$Ncc)B9קJrʇε1@08J͎N>~Nkt_%Z\8Q H z6adCj쉗sxT|k)3 Ew E6Uq߆s"$ CHŤJ8*GW##G?]RgqI~_lwF+Uo- r#aDQkNv:(}{1&O> ƺiɱȗSRʢ/(NbtSFirVh̿d@5gY=5A9Yi&NE,q%h#NI {*oqwbD> XJAFN {zW?LdBDVH=t&Sx>< S[.sQ dێK'<_*1H3ڬ0&|K*!l&"<ጰ|q2"Pe19f}HpD7YEfZ V"Eei#E-:)8a*pH(o$S&5VEW&yQZApW?Tp6q@RQ)6.Fc뢋4;)JINY>zBRb$FZ$.r/wJvd.FkaXt0_Gt;f2z$1tjS2#PdFV9VH). 
cZ5f^^0Qxoミ2fŻ߇ "đ҃qCv#)u27 #xK6sw3™zw.|es4m˶.%\ 6߆D82rPUQ:[A.2_݅wfngGרUA:6`Ndb6:p A0zP:ۑI|oV-v ʌ1MJtlV཮h}f"\ޛ,v*M^Mm|PԌ|7{#f:"?LX :"T8j]e=_8h0B9Gwԩ${}ݙE|~DrG6SC OHW2$V|ŠR:d'U%J]9f3I!˻8("z29YYF"+!ErGBH%DBsԁ8ϓ&ZL)xcJ/4`^^KJm(j#@2W`0b_ht &J xdG#8Mj0.^C+ ˼Wi=8Ar<:@Zۅ=^:>' \Hr0:Wh]%kuUz_:Wn7 z7\]ǾKƍp^K&T@J8^J :Z XDmB!Q̠Jq啔K4<̙ {&F ^9t]BA885stDE^3&z.eXS>q2rMa%Wl8H?0[k6WU Iy noWTIY*1B4'}3qlj^|@0^`37k] e#]iKΧUWR>"*bd[(aBWئv64/Yj=xCD.W;ɹ}$2&[ f(㑅-30!+trn?ip]4Ai;"Yn^k-ݿZTXycɴҺ+]-ncE4$)$L~k7w(j?~,X)tT_2,̵Iz)%%i-&eÛKPn:ݟǭK64m(wtNgg<p;QHt3g=LGg}.p{NMFD^/ }M պ~c,&ɤ XǏ/ߕ'QPG<)WRiN%gxOJ L Pk9YZ.ˡA0 Aљ'ea_2\ۈiX8;fmԳᶱ%+ 4<6;:?`l~p|h.W{3 g߶DDM `sX%t!pg_pe72dO)(^ʞo]{3bAErUbTۍȀ\bڌ?Sublh&Y SVBP]V-Z^҈Ez8҈-)s9%Tg>ޚ ZSqν{k(&~` ٱg4⤤ +a8l0K*'Nn=5?>pF38M^G)#tfu!ZDtM4Z=g$OIF?#c LHmm<.TWqr~_Ow)S|5$cd 1'&š0(B:\1rr9<}Ac^t6TD)*+E=ǘ7)( ;  FO^Jn! ,fȹ3-b8 ٽα ;ks1tM;zûwn!VT%Z:7D'@i)fn?]~:sc6kڿ@U?:HV҃l m#2M~'x*L:{CQ5>+gai9 Gã᷏ b[:Ɲx8UY1|f3|Ӎ$__woAyMyI^NoLm+59n=t"؃GڙpƤƿƃNۛzv?%Nv\WdrY &GOz,a,{O?([ (I@36;(wGv0iҫ2{_w~b;ʇh4s:M;onyջzW'acqC#:ǃSׯ?o-Kfo ]=Lh3v;ې]`01ѐ <333^8UgI2}iƃaZФ4Kux [$b8|_oFȸ 80?iOf$ܧwiW^ _^e򦟚q+Β|M /b#s+=XqT񸟍֭ex|D+*WVQH_y+HiB CS Ai:ZCOx6f0CtQ3'Kjl0n-d<뼝:EXkiQuy^'9o?zp3si/cNG ,*i%.p-K6\q5\R. G<Õ^%URІKiq"6^H ^9%a #>E6,akY5OQ\&k(Ҝ[H´x"mb zaR0"uC!WP-g4E`tA`!X LQt7QqAYϷvA{o/Gm G v#* !%QRql)ܺQAynxԍ0S)2HU =*i3o{4,|3oBK=HCP%=ƥfƳTJa6 )l~e^xrevZJ?OF!|M )#$RS-076"(\VjyBj)f*$κ)]:c@ϲ ?NIp8" cL #eDs1`bl++1"muC#DR$`Pm:,`9gp2ˁQXejr_aQLͰf?蠡Z`.4plM.hQ!UiQ2j\F[S2j\Fh#0- |*g#KT#\F0s5.b."K lfE!Ω".ɵh #.9 lS+*XD|,x`lb8!e0*)M񑕓Kmg$4ݫLOUg\4aGH){Hd[I"%d+VV)\6U IxJYԜ{^63} k0{J#*vܙvPSjуryT$>卟9Qi{r*-VJiT1;Wqf{݊Z覟ZGfs z{f%kjj{rI|*BckNÇW@onF6nh kY^[[o% 5Z~(Y-w6d-Rٯc7FPu(e} #; kGbXv+~STa>{5[ZoWɕx)szز;i[Z Վ,beh`U0.]\]jbD9ϝ~sD{F4+]{\]g@!;?vWXc@I@i`vLz%$B>uvT^ӧ^Q@c~=-(9#/AgŊ{-E/^*}X^jWΗ |ܸIōw{h)[<,G{PX}Jv;D mvV3V߭4ץe:Vԩ+B;Ճ$ NW|Omx?_`wѲjjR! 
y`NDf ԺoʑXVLxgS-SO\?K5xW [9\+TgtjIT8=WtJ ):fT}h=gt T[<:ը?!ԈUOi9PYxCN[S=(TZޡJ9ܧ{w :=mm0ai79;鵣 U5KA:YZhpα*NMn<FM(}gk^&놥_uKUcTq7J{f6( {TEGBoQ`fCnCd_|'L S[7u0Nn#v]h_SHI9 FQB5z^ {>4؆AxwJߵ'mvSAuB(}2Qi%l#.V `ò6V4|k)hBj-ҥ;B`6KMa<c1/Z̯l4-'Gs"jC 2oP$/22 k[%|Zo Ex0A,BXH`J<($5"":.;—1WtC$ڃ#4BKSgM` DN@:+8PJ`,`<ÁҠxN%odxpА )]Z%Zv!Sa ns=s."}&D~D95'2T?IH}@j-CL c"!-E2Xv~9-_ r:Ƀm@+$ipU`G$D iٌ5fP栕\uCvLE9y;iy;m]x@ck]՛(rADQke4iu];3٧#F`sr5Hax2)́n`I س%*-˹, }ֻaa TE\Oaf<C0)T rn`3,V< "c=x*@Q|j92C09Tl [! 쎬N +AZ(i < dMBeHA(ZڪtF%+(%quoSv.2X]w/巽Ȃh%?Yキqzvwq֏z]s4~SI$nb%~L"ODxiKENx\Px?.q6ďS  +| $Lv2C Ck`&e-32a<0v+8=I$iP] 1%.qF#bKpA,;\2dFlnlV$7ՔYσ1n)֯Ūb%P. +]@fY*+ 9++YQD<3Z \ CC>rI1ݸnv?;Ck uGHHZǁbJ{,EN;t _%+&5W(wdbYe%D$+F?K0`nᮢQgޘ9$BR*G=HT8yrRE2?Il 9Ak +2K- 73EvI^ K.bXX6 l {cj!Xl .ֱjGpuB<==- e3! ~R3H9Z]'gJP}="@2*A='%0s!cG破 hρ5a)J:-",uOv Lx$RײeN80w&oXXIWxx^'0[>Ws:0OV _lS1q ȿMߋKB$]'^^^Їgy"8z)GO!Yp>i||\] 6~bwʜM]qmtBbOi%G7ի['wv|i?f)4JJ/?V7Ov7Ң4,e(Mn"(?~)T<<\]ޠV8u缼.lYPzr"$WS92&ʎOҷ{wʇuQj.)oqi.mb\ېk&T\AhPZ2ҹHw\hO81;@M&R5HLΉ_}qb42þRӵKwWDgUqD7[T=ÀۛL֑mvy_-N͔}Gެ)%#W|S/@T\w/i+ӾiR Dłt^lCXT,xL 3 O"%ؗ(b_}iXQ]JMEu`k1,5'8{&@PS'}QVܗ-٨v<1˾;"Xd"~+O(lILe,x/"tԗ'PQ$! f,_B;35΋iwfjtz8Y! ijb9dXl2ƍOlfqL۩9s3ףG! 1חŤjTOz!nZ@RjnK_T0G7U6)d =ZP۳T%n_sutrÊdDP,߿{]aMǚѠ%V ZM֣lPg sC]O|ᡸE %3p:y*7s|&gZ63fm hex!K<㼊usJJ-%94Fid?Wh"l$ 6T 1.(€7҂s)KjJ 2Sr&83DhoO( "ٲ0GS+Е"j&l>1dnO;X61Ҙ&d 3&^v[ʊs`JI:S+% *B]pq5>@Q+% i\E8g)&iYG@:U!xjUrK5SW)#-EŅh)ڕ-l4#Ib7̙Ϛըd1h֊+Vz UJAgXk0p(,CCSJ `ƬQ0tilH kd:WC7"g:2*$t_ex_>^1=r_{k6mc[-}\#Twq"OWn..Mwpx]U뤏$w.2e"g*7y-}vByl-svCB~"%SnG-}vJ/SjlBS;e %J.l&O7&Nȝ'ӎ_<2BOrD>~K:u2^l,%Nk' t\ZM4&!!2ߊS[빃VYIP%]E&NE^bJq<F-)Qm:;@|A:wB5v_@hX XNq66 _%aů*`<o ՘ѡ@KSɲ{"P!Pi^M G! 
ƺ(}\aj$WRDŅHp ±}9b41zڡyh%ŭ] H}EPpw%bRV0F̘G/LN3Gb;qR) *$Ѕ4$ٓ"uD] JѨ"JE ,1Ax9"|j݊Il43Ĉ*Aan^/حeD@4Al5)&r 7b= l 0bmZC/4fAİ؂+BEܙhBKu.FkǐZ1fXeR6EOB*q Є9)zRCeԉ $:{M ,!4 .HX,C-OUة[GFJq oZ` J%J-)&*.5-J1C ZFӭbZr"S"~ErwGfuR*PJtIm<ړ,1U.-]ipYTDQe/-EWEc WρK`d}{(!Ut>5x#J>BQ 40#LafptǙ%#n-"u0lQ[`wpafl$8᧔͂_pr._O/vքVe-靗wȠBv`.vo EA\nlאwaDI;wEzdbX-vLQ_6w<5A{Ctj0mSs8M 9;M;yc>II,SuAuOTh. L))gu{{Ӡ 1߿Jm@ "e C ?dBp}% a;CEDŤ3~o5փriM9餇IfR=gIbX,iW[Y|ZLjMӛ&Oi4aGqljk{rqٙx{TQz9ntO)|M I!1e.GsB2p '!V(m~KH⑴-xe5)od2s\+ˈ Ti#!Bh'ZP \RSMvVFjB24D]xsv Мn؂ÇZ{QAECfSkь=~ټސ)y֯V:Xq?Il/]2s6| 7kXayϟW7uSdX Y|Q4/S-f<5J#t2vrwbI,>N~y.EWL>'jMboߜ9)ATlTo\_/~actc"[^[-^$w?oX.*=#SƜ)7 :87h;,a|%;%/xE7x}uY<"|U\|\@U*>^:{^iJ*4?0V _'$̴s<VȉmV[iZ5٪RW<%Ʈ|:[3H4cc$ri]^9y5 +],$8vt^Hfv 4\$&$&PNNi oˮYPz[cSf!E&Лi=fǏfdO:X| wol-la˛172 oO ];o{ 1$!o(1BA$],e65eɵǓYwy ) CÔ9 )=0嫍x20ty#YT2wK'0y9C.S4Qn{W`p'kFE ;8b1Fv^&x5-E'e>f7#->MV}dX,~vFu՘'W_ Z9qy" $%<1'3OlnLz!,T" }gr @[֖-@ڻ;h aT|@m*[ݑ+tmX7qؘO"s?h_gm :A~Ar4$ c8–Y6uBAҼ0y|ue.봔6m7POb~Vѝ5?m{[7>zxxЀ9:5P Np$ׄ03zHJM̼!o.9EE/\.d))50AkZ r)KmJɠBUβ0_,%;5I] fQ Ȑ@@[RnWV릳5>o.n}OW'W݃':UP~~yqn^5I;wm/0L-|c§aHx(<>\0ƟzcjCt "9qj!% PLH۸T`-dUvӤLdH#t["'bh3T;ʦ]C{J=U*;~ B?,wҪ^w?WG7J2dU}\@q'b*xn?^ye8iΦ[Eq&Hy:+bf /Nq\qƒ9Džq&xyq|(t9AsQ`˳lfސ I"'r]BS.w!Xbw6Vq h"}\=6^RǬ9 &`0)Z+Bm5zNdG,Q".ja08ۇ?̿2iR3S;ԴI\`*fS}?\őί{UK1:͞X{o_9ƎAβ'V;²~^²,(+bW2ɘ2%ǯ\ĉݘ81Qmavl0qsQ@}Ę⊕^g3QQ!ɉ%ܳ)usv' 1чn nt;+NMkT!V Ӆ))vqggC u3^{O\xPqn[2H Vsd{DYaЏ@ Z:D_G"e/1fk,&(!QI!+Č6N D Fl8W-TNǽ5”2pΝRKSKE$AV!a8蝿A9h|ChLqXʷo$5XR 6rۂLx5V,hub|ChL)zsߺiAVSFu[rLZ֭XT6Ƈ1Sc Ny[H^RXΔ7sIg%RcXL'H4*e,ad %-֚3ʔјYmpghgN0'@q>6erq>I*P9Vr- `h߇FV=IqOߋU|D)UѫȦ=%ѕ$.ᚅq $֘2ERKK{|?/G #R-W׏W6Vr kNR\~pv<ť'9Pw9cIKb@S^xky:1Uf:!Ei&,3-O BLWuTJT: 6 ][oϕᛁrc`IW001(8 Cq_PWF"9l)FR W(54N! lEރwC B0BBF9wW- X!&@7B$Q 2,>^8K΂$tTɴβ* 2dx* !ogJ\_HEANh0]%EaNP <]ݘ$Q&WL$RN~Sui g*Fz*d*>lJH jn q iXKhL&ڊ 0} 8> J֚SGj/2SYZ$g@B1ȋ!uڞa c߄luhk0VT tT9D0OG[VdrRX1@ÊeR7e\H1׻XbFh_h/ CKtS6q`IiYU1%0HnDAu1eDrb[81 9sɉra V@DO)&xQHۨh\-}Cx(dJl4} z)~졃0,G$A{쮕nct?#nu%kU%2-ͦ??t}͏?0ܜc"YH}KA^k{{ORK@2^(X $,Q%FK2F}X5,cx])HPpU>e(H*p%S. 
d%%BdJHixŒk%^MORK*__0<%0Ӓl`,@x+Y]Nd$ZEB0{%v>2<ӉaS,‘(M/KX³. JӋ!ó-s6-K  oRO޴7W.;NN凓K4 bxI}l3NuI@22oԢDF7mx׼'\%wqPLeUg+EܥזZ(Z%W Mtrv(8+bx.~-9wly)x3cĀ-眄_>pNR<́69SOo`8g= 츩*x&;;\vܮ<|:ͦKk&[a)D KrttmzchwJ3&\~18TuM`1-6t մ@X%<`&Zq[(,љ/2ߪq( +NDN:FijV\1:3\Gp*q 9b"{V|U2C(Q8I˜yaHTITLg֠q)ͭp^iÔ]{ɷu{kM^_Oʫ{'by(,sMQ͌X춲b]msF+(}ԊҼNwke'j0,S$ u\H ^irY `gZ?sm!eȱm*NzafY!R7ƊGYo2̫Bo? S~Yx8~?hi]4eε2pwmotޒ7&] MS\9l&~F똅bMvd3p4B8>|vZU{ '1Ћ4:>h,Sx?n:O~Mu{?i}Sk>KՒ.[K.qeQ/%o_SrNYK;|rMuyhZm і!O cj5-Ժy7Y!4z ʼnf cl$-y8R|xs^djO[.{.)6׹=P7=z q^߽$ᄎufww1:WͨwZѻuVZ*wBLͻЅYzwWr8PJ{X@FZgZ Eaxk_!uFʮȖufMPL+$qNbˑ ~O-{*zC!۝ue9/y; fMՕ&#^(^ԏICuEӚU7(6aNY(X7֩H!Pub8RlavӀsq ԩs{w($!;&b4NVsha-9rDŽǖ0D'h!B|g@Ҭ `3 03!ph|?~!]>!v:bL9;I˜J"#_Mj)D 8d&baPxT^w)?}LR[+Y~nff$`A ].,mn .aؓ .ޒNދ$Iw>IҝOt$$Ng cB+uS'ceZ0Z[re\j wՆ*wƏVo_]6%?:Ot!L^\RB$XV0'J6radAxhګu'\/ھ ",#X|}ELܣ\X _ Յ([H_v le2o~~~" >GFT/y5U@#2yV槫%_HWZv/^^兟]Kx9y/}0f|?'0qIf,&c9f\^)QuH,L@YMeh%IU0sy g^O3ݏ?UzAUHoS$ܡ ﲣkmT;p0BT '] I:21& ǡXUIZ՛+W ysg;&I.d3&JYĻwi_ F37u;"YۖW#X$oxe.Aa95` דc3ه?iNRkl=;wRף7{8|SkNF#oߤBbq3~\]MMcrIk2fGEoߺd>B]PJٶ fE fhC* MЂL Ӓ56TXk0uqk0Ǵ7Z% Iy%<UWij=;'}Wp[$׺)ѻ k_;NbqV$ũm` %ѿqFEnv%\ߧ[zQOtX1Y$jqbl nƳWv<Fh^oV[;^fmd=ãKҏܿɬ;IaHܭzvŨKZ1^Ele67_!8e,봦2vjҐsS=)Λ巷?CfE!|>"#n652^޼-,yvΣ)' z Iߒۗ/rd W?nODD޺T O/.@ŧl7-=C̾[)zxh蛧kTb}wp7L)Pug`L bT\.R~yE^Z||TJ!"| p.RKw<qA©'qMdC;eDFE]O G5lN=SX mb֪¿E B$BMaucm8T \X#&#jk|&*hj8=ُ=7&؉Z ,22%QlCQa17j*~) %eVA0JO]/5Af"FMXaxN1º Se,V0 u]Tk|tMg,^E$9Pf"b㘑1B.d<ґ#p )%qaHŮVx^O=|4"X 㢈sQKXPdUB NPI>*/_nTr >&CdC{*8X ]TZs;ݯ7(xwn$ݻѬN?sqR JrDN3=0_n{ UXڱaxivKօPW"*w:Dhiv Q\xHʭOhKK sI2W$/EwF`%޹ALWmP-; aV(x9 l+kRR赁`vGV QS+#evguG ҩU),\v6{`6N0 ypДi=GL TA<ɫ| }ԯifLO^w_Ͳx'| N6~FX-UW+E0Qi kK#5&z M#>`\~#:X7 *rBݖx#e=I9"%(}IJϤ-Ä{mZ@/W'2X\;dQ4[EhMkevSzTeiu&BL<1#(o)[\o=h"<3-GQ0wZ?@Z;ka9!Qe M9GJaE.YCh)ZW0=gg8BTS=pKF'u,gi' F[*:;B/j9޹/ݵp?7@t| t=t=$;zH8{<;6D /stpv_ ft 4w`+]+O~ ]c:Eޛ>s`&;tn2pYNH)~y?9VI ca__=(x4bޛwO.ʺ~uu_>E7%x}I/C7~uS_ _]8ˏ.aWV_/&|vJ]ļwu=ϹLm (н@|0,)9-Hfe,fJ3nKr>љd2ì+?@]\L5+ Ѭ8Dp0,4#*Π?0I~˴O^fUGGU\U6p}gi޺8Mż~_dݫjmhAL$W'sQOIPtNuZ3:iFTO#A45&B:TR[QqXF֟gAQm`R^cgU3Mqa 
v"`eV0j:EXL1­Z7gh>_1Gtw\@PB`ë1Sp^gnnlڈ,,`3,_U V&7(|Aѯf8+1K~z&b/A ZgI!XtKEѴIPN-e)Zvb 0Q/ݧu~>"DUa>o7ǔ<&X]'ϕ=t&q]yQ?4RSpQxl.R K۸NMje8ɝ>>#R"甞z)A1=#U)GJS#D*g/SG$2p_ Mϧ.3,ye`CYr1UQ禟 tT AUm,z 6*(rgnU ^yОeQrd:.*; $\PV!|"VeE{Śثź_{t8[^V֪SPE),nBVp0KaZM[jУN<F-= ^U-lz]ECK@ ͒V`6Ls`Vdo0I4 gx խ̭̭̭49m;߶fzaK4=CԒ~!9 : D  YL_'/*QH$U$c*@<sQHQHVZ5Ius6`"2 YP2 ܭ!mĄ>'m9dV8f  ΁ ~dA$ }P0ᇡz `DĔaD )OC a F4TXGϛBMLLMEGтَ*?F$~ygr̆»_**4k9xەІg+!S[#sX8RqGX)!AL0}Ӕ<`B*dH F#VP*o>WN`ЎB0 Aހ@vk3bc; ر"`c!{0CD#Jc#c "(810qhO ?s$I\Q" "R"09 #| E~,c\±^Q B4O ㊂ ڼJI}e|""h=E[ ˏBV#\/K[1HϑЫԦ(@03 <uZ/X2B@%($h "1b~̅@ ĸvYaohþz ̧xXdtA_eJT0ci5Z7ϩb7ČZ93&l|7%C&& _CVЍgbyfq4f!0@a[3'ޯtiNlm#O(9ʖEY`^jJ(W*whh.ڥ1A6D1K򅄲 ΏѾ[^;{\}R?xqw(Ę"'"(_LՌ1ska L?Hw$UN9|AжsJ^|'8ׂKQmuNk7ȸ1N[ P;2dDjFGcE=Bc "b/ƄP)br>9T]4e&݀i<[.nlDo0D š brӿi6g4'F]x b%Y`CIkmI$ 46,P|FLquy|>Oϳi R9pTmpYm8;[juH?q&p0~s&6Kw&!D@:Uz3tia/-'}숍gcb?N^dnU%^|d9c+ #)N@B$7?lA5L&\Z5aZi!40z|be[gҡ] hcJx0T= 4>YζL`٪Л~}稟elrģڼdӧ$0%$DUA.-ّ9piǔvIQamJx`Zj^.S MiQ 8,%& ]IyԆ!]$˴`hyfgs=ih8+$FC`wbop쟳G8]Uɷt ʢ"ƊxH`z80Ԋ00ɻ @H=뷤Z CuS]jvMmqu1e vQ0?$vB Άyps/zEc Mz)e`?ŚNu(\%z=mSzv]톲S)SK24Uyy2_ys74˫?-'=w[x]SU;Wr6f1aChzWȰqCӻ󏍟ŽGjiR])){ݠnA/avTiznG U1ݣ Dpqq [ez'b .V~J>| f94a(M7kf4UC0|./^"go1x?F,M]qª_b1C 5 `ȑG 9þsߚ7yy_Fb'6s,w4fTQ AjFի^@)@y=f|8Z׋VمYE\߃z[5aLV[<ʉ^-6=ի^@+zX{K&[r`+P". t lzˋB=Η*bNSh+R4]]zR!?"}(l&#xzڨ(Yyycd #F;-'uۚyUdYfWۯ鰰zؽlbm2\P4\?z7822r8sy2 9էnHa Ӄx8C[̐rErpR^z![~SQ20wKfUxbh2W)tp-'-OI0Zv6WzԊޮE!E1l7դ5~.FR( okXo } ˲ ,<)t&Sd "6zbp Zme;mtk"-e R-:X6˝dX;oN玡Na;҆XX=aDCI+͌1(ixYxbϢwYjۂ W[⤇GAVE{=p?GWaWi#""A!p R:U_ &EێA+2BL4)c&;W%siD ì3?m(کug&N0I ז o5߿ZIwViw6.?{&EߴXʞt9 )H#fc|voNbH1bb[57&z,jv!pDD!"ж ȡ䮽6[ < 1KX0_S,)j)f $E=}v$70e5d֮5ۆh).oA>e%{d 4_]榰1T($R1+knXŗ{\'f_TR79vquJ -I[\M`0 b+- LOwO7dS(laƮ}206a[ lf}]}N//bӅĔ[8ۗyIM Zq+U]CaH:61(hӟ=(E ;=7ҹKVRć) XѻB=*e..BIԺS MH47iq(iK -Q>dQx(lpRO;!ҡ-KH B [SeaċrFt8~;6&*:f|-! 
H:O/~S/{6\JhU>@; Rnw#\>>5f~3֚ĥjۥr_Sr+ hq!D퐐j#߽u4bp18l+bq%2|x;n\5OAdw A]dk[} $[tR\z."ew[s˿{\.z{?Iq#g\+i, g23Ș&Z&H*fXi `%"S@D jnM6Wo@~ر^3h ̍=nV>) 0ȩ לݖnkkGM4*/İ"-E,҇"ԛկ/0@BV_=x׷<71׆ef"G{Q6 a84g>wl`4ݺg#e ܡ432)ߎ/mME㹺}̥wi5c 1jwmόmU6ȏLqL@]EZ2Q>3WhT@28쁪b YR,z>Kp Hz8yV "(Hs75ճFfq983⚗D~ ^6W YjNti.᷿#mu>]1d :x`oo(ПT_B= AWwF {fj NAiy.wnlYV-e$f7(xjZ?OVcQg4v [oz}J\Z; -C,̳ywTeWݤYpRrx+z`7KCybCƗ[hoh|CB kjֽ W5A'P{=7(Zsg\=ozѿ^0;la[z_=Y|9g(7CTw>H4+x4l>kPgvR͊sYO%N MċXoOGA)n6_Xm&ZXZ$bBo|v2mq³e[\=_σx`э`t(вvO3*GF@6iK7N?W+ѿ~I?u*Jڸ5=m$焤)s?f 4EXz6 K D{Vؚys6 ce"mKQ]kİYG[O -;ADZ8G`m7fE7magk7F qG)+k֣wn3XGD& @ꮮ>~Xn&EX& 폆d~&SA 2S>\ gcFc&-P;pu{:5#I2 5ɲD1LsB. V4iLk Y Dà*aHҡ*g4!D?pqe 0a 3 }1c ~J9ųD;%Jj;)(383Nd8RJAؤ*F7,ܯ͚@ܥXAs9xrb.W.+%nMbU ga >˂0^cfD1iYSn>N|aզm⡝* g0ʭD[tʸu.%+(iego\d|EFYm  BDT8ǭkޓ(IVGIUz)8AmmԺ6 V4xNAu-'1uҬF$BlsIN+;Մp?H60QzPH?i!٧To+(c"π |$T!7KrZݴ@%R*[KfwLT0K.b>N\N\| A2Ju2۩ #[LjT2N۶K-VD)L˽]k֚{`G xuJ8%)dGUzKhjpxK`^ghk3s *P'[[=T”J||e`ׯ]gѽOB*!~h pK#zj}.(jB1txH_t\*4\z2=Քs߭Ky'c /U3.N0(NNkgP'6̙BJToqp#sW#ֺW>otj|1!V+a%>dZرU-XQ0dʙTOwww>|{DM<*e~ycԷO Ōߐ3P8mrߵ+DZc=Q^\ Wfqt`½-P\hCAzv8N׀ĝ0 W48/ɫt]^yT0Kx?X Y?8T\Mcٮ0Kݚ.o̻5O|XMmRUh(4)?EX#X>'db)iE{5vʬi>&< >)_& 6{NiW7;H+Wm&wZNoY'dwM7um߮_>=smXN tKc{PoUGdEqHmOy3L@z W@zCM0Apwt5C9Ot)%¾๢zGCQKSXrΰ^~'␊C5$فS5Q*XEO^Y!PϚQS Pܼ5{Jم1kc nSE Άo!QVbP&[*0%Bt1x!d"vF㹺]}mL }el6(ՄSd~7.퍵7X{{S`ON(dBƒR@$S4X!qJ3(\'YF4c,U "L9EhaFCPO[ 4H_^{xS]w BޛbCY8C[$W[8pxw}C @6ͼXȣ;<Бyx c7]\=̯}5:3ЙJXsS2!I J&J("cTL*b)";Vkt=R6jVD?Eߗ=\oz ^ 7 @w/,H8iq,@1HL`J*L8Z%dRSI3b 8fqȸLp $f@ LkmVB_z{Va@ -Ӟq~IaVth/+KdG5AcK+3̐AIp uiЬ eq숲yFFrq)@hhOY`*!dUL7^)+5T?8/MYn__qle=E<@w6~BY+}_F:Oz\/e?3Yqr<ϛ [ R6BA!eo |'Gm, i"kՓ6b.Ԡ3.}1)?DHL8 p +Ϛߩ_HbUpJJ hjQLS I%dㅚiDLEtDOZqV{~ŹƁv]OCV:-53i.j=-+x?Zp)T0Z<0yhm?VhA )'' y͆Yi^X4AHBa˥>HZԪͫɾUhj(8NZ J&P;!% c*':18C$& "c#(1HnV"`ZLGFs?RG\U>{(6*c ;9ɹZ|XO0S_%Q!#h0P4⚵XF*2HIcSw GM`2OC@|`KmE <א;TXwҍΦ}!Ga7:Lx%*oƴv) NBBn׷*ʕ2REKuؕVjjv2I:d,In@;Mʽwލuܑ'l88dW\u6,PF.@tDuD`Ad-wQp ]SuXeEk[ h+]d\ :'] S>~N/*N>5q5͆7:e>LUAҾ#iQr~{8{{ƿ~$H K@*pAbTvMJ~y}Z Z"Qh5X%!n]<ظ̠! 
Aj\F nIQ7`.t(kpQ U;T9ٲ 淸jW3U0{qOD"a=ɸY ]m>RXESѹ3ű &3ra3kke B&K/6ɴ#͜XXIgrpPLb|E m  "qW59)M>7oȫSު4*++owjߣr{ɪZ恹T8pm&%,j A㉚/i>_tg᧔k=Ɗeqdjn,6ڦÒ%Io&=ɺRHBr\{jזlg͸g+)".<1o~~k!v^vf-rGZ\-myPَQelFhLjlhJ.$+Q յxNܭ5X?]iTzuD0VYvnbFkcnY/-{-4d|huL"k&zV{Mir,A^^j>`\'6lIk#Fd9vïxVhc]~ (hzV0-/[7.2AH k+N#|4]>yяyeA]E,P(WY=oJeQlܿ{c0?OrW^<4x@aOyl0%{@Eݡdu'1DӖ>&W|Le0{7fYNiϟYnvիo'j|G"+0 Quj2ԝ9Vu&#˙S̚V ȑU}+(N$ #?WvpOYf <>͉ږŔ!ҽ):^;3H2&j+s_@vG%zd<_CϿ%|q& ˉݤæ ZDzmxEF01_[ݹ B۵#to_l|YOExt\ADF>VDXqa@*˽\Q2C9jBDb=I"LI/>DSY|KުF5è.⒤rqud90jnp]rq> pѾ\bc0AsԱvL%ͰCPKj 8W 5\.{ǑbSH] Gf.[Kn"r^d 8Wn1_؉&$;3&r3R7z)ـzЋkLӠwg0 ëW=\ƶRB[.ϒr^#7/~A~+̟|ؕ!{߯ UeȬ'`=}g12^)|4Wδ1 Gey4 ΣYp͂_pV_͆?Eڔ L{!"&@HQb SP* &(xjRH?ޝVܪ@t*@ uGתIfc KJ74'[MS 8{*R&d(#/H@C"B Epig }h SBcI$hG^櫿+6N Q@tB\D;_tC2~ldt C^l*S6NBg.SDj]wpH01"&0/L˛ VįIڣ@=O(u7|zC}, nJ4މIyik)Of\ z?XL\'OMD< F"zO#*Zeʚ, KjI⇓ lo؛UO{k|A;seAI{$FO&D$~ZJ㝊{y5ٷj-#^`Eux8,HEwV(_#i Ki R3Iv'lb0 E1 uE  uIb&B>@u!_OwE`xԪuh8Yct/ffM?t.f ;c%8큵pdߧ(4E$M\3áXt8I8y*p(#g, D LTB`$e$~/pHF'b$SE_f 6VH:Gﲦ%%7-HN1[yd@%Z( 5KZ*av{J/g֛mY.0H$WƷ7{6-*}4\V6"V[t'"&g-Vw(+3yG"i&'+PI I*IF=T n܉2uW?nTcjsZ xmmI'ntnw-Qvq#K9F/l̇2r]2Ζ;2{x\;?t߻-? 
(˾ak0Z'khe}XBX<-B_&RZ2AZ3S|šjdKTyK˜Rw/u `k U5U8Vnh<8v;3hBQjOt`F dkHR9̾2)mx'Ł9hLJ@@\dSyo4X+sfN zY"ۈg1,Apq0[7 㑌9?G}, mw;g4t H)V=OT`d+[g Rz|Vϛ{!-7 񳖕6 .q{ ,-j`bqR1BǓ!I"|E0nI%~"N!gXnujq4$ݸפukɢ~BSQs~d'\ks%73#&&C+xHʓ8q[^ZL)`<- bg!84*va7m.PjiC&+(gi&=~οt ۖw)zR X[La\E3ż 7O͠č`:U2Q'BbjѱS|#Ŋ Xm2_]U*n`I[3.ylW^rR׽O\"ͼ-`֨BŠAA!+ͳbRmKr()td46˜& u}(STJ9Хė?YŽoaPO D;q_]^B^}˲+%7T/^0*;ug;r/`puze"$ڙD=D:6h5 1Tǀf٠(DhD,3GrtOz]=ݓ '\ ;ҒGl'$2bA)"2I 1aبȨA>PϽFzÑ'/ĝXq_V+,r,Z0Wb1(0Z$LQ5 "rB ^Åt':Ny7_D h~ýPAC>YATR{- RƩ&ʔƝT.BwW ͕IV\U4/ 2eW=bѫ{t]}H*@^4Qq3 #xdi]!R%H[(l"JZOٝ/*-x'+%hX)xuej |H=$FH .KˍW<(Cړ̮I&b~D3T<;xP`ib ܜ7^&R$1"GZ1BymVp17HbD35a& j-慸R 9б4bNyls D.I oPB(a(ϦWl ԥ#KaI-T5&@g."AsNJ`]/̓_~#Gkx@` W\$\lEah@%;.50a  퐂@7(cdX #0{HPJ`za E~ٝX0٫Ϧ(er ٩ieIAT?`3(hQtG;ulfe)`0 IQH*OB$Fydi6AKd iuQVYfOߏbUmtI)Oaw9/^!,W%ğ@rb,o הˮ9%(HHHW(kew1Ul{mRM޹1e>ִ'R!lw7R︹|nķ6ǧ9^++w؟Jw N3P6'xImJ驒7likwQUHBj"Ъ5ڞ{EDe 3GRhQ-N.Ϻ1~QFe 6Kt߫+)gՏ 4Ef(z1b-&D -MƤ̅ț<[TNN@h{zdgK3CTuF-7y<{ 5~''tӡlQC3ќ88VT{`j2`wN$P4!2lP^k`612Oۘퟦmw.#=64 N-4yF-l73\= ']篦qh2ފjm6WmṢiMH[q"`ZΔjvQo Ӳc~4 N14jAsV1 ;̢u@'?33c@KGHݲ ;?%pc}%f9tW_-/&Ϸp< Oj #3J[qo"yfl(6 d| ;t(%Z g]Fl8} V؇Ip. nnyڟV'ImƱѮ5oq _N0]*ΩةFf2Tƽ5]q4'ګ>k-AmMv&=x4=,`aC[O+ #h@\C=ӫp0pz0ezӊ]*&:!^U8\d_UBu!^Q8P<\kN劑TKw_6NFf2Olʸ<0L$XNo GW#Bj-HBP{\T욲U ՠ>ޟz7PP;+1钑LźywUۈt2ת#vM2hT*41t}[|Q{ooX2穌y*cy^/c~OfӌD4E\T>.6rjVXj*;ib S|V%[tm:bŔKPl? ۜGYsT^}\&>;+cWȁ;e&ZPnRu:An9,B1LsF O!I?:@8Ķp]04jͥ˴A}[E`Q#ÄW2x=s.{fTX Ȑ#('6:H . ,`5VH2*GH:e; EHeR~ gDP+!!r"\rϣDL` ZFU sy5ڰK!UP+ro]xF%%gYQbsAL 啎HB0egY4G'8(bHA! 
*Z()81dU^{fKb7,.r:Ey,l-Jz+T(5$d3hR(o]o\>=3ûPշ_qUH+~Pz`Xb<(M?cw~<5Lr%I5A>'lë!}ʫ?^Nč'oK~8= s]5"_sf1r8ZǷǼ±iP%Ҧ5hl ɝ5g .X Hz+gTm֜̈4N,z O+LTQy-),$u yq-e]1eQo@.3g4&Eadd59)?1H F#cBO.Zbߤ<ڊ(D#!KL rnkr)xFM4Gp4- +h9n-Jp.iP 9GnBoyJZ r0%Z)`9D,jcDNYV9R A(JUD(. AkxA,rW6 ˚.rX=_Ӧ=V t =V kՒNre OmS"H/1%Orm EQY CpWs Xo~܆=VAHc0`^$(h&*|Ĉi0ō #cw2r(l E100`24jfeV̺Az,㓳iҀ0"rZ+ WFCCCI,6hE[84I9f3.LpH BH\ji!XN+z)vBK[#֦9%jҘj 8if OE8cYA%3=)UͻrNa6zC ;1]݈)#?sg)5+(càd&#p_-%FS(&@i1%~|ጠ=łeI|6&r4x]\2ĸ#+yYAjkک^ǿtvYMϟ3Z[ 9ַB|VθJ@k'a뇋cERINb\~Eی6FmJyqR"8,V1[挕@x{A~uSqf{.l`^`ŇٯLq`䂞|8{ l.Wg`3q~9*khW۰oT2R1`7$0#vpMm}&Yayɉ&i,ې?8]>]v{YdFrRޣœ=$:1e)έF'˃N!voLx TyiJf _nTLV6iloO `tZGelY촏s&y9YL8. pױq&1>>0Ew(/zYgi<E+%?])((QQ+o7Ihr1*Z=0:(FDHҤLg.d:yNiQKN{D6hǴ@nLuh[ևp1Nm ,WtbGfG'U-<%F_ ܸat[2]cģϤ* t:SJ oR>Ie~.[SczE0ۤOtidr8Tݐ7|q dRR|*YNÔ$6f_FKqfGg'o&~㓏q +rbD͙V3c5ƽ]۪͈Ku[4|cVx/ju@s6݃c=W[! c6gxRݴiLuؤ|;{a[`Wzk<=J+,iAll}Uw{6 Y[AlpKSAބw/PKԏ zpLk?]['%Oޕ3-6L[uc=pNj()YQ!2](Xcet**rJ'MO;^/|B1zA9)$t!렄U'U&<(KKNFڽwI%+s粒܋sIO +da"R1r4UD43-=I`D #T)͏&Y0;.I VWcn; ew:ApF;$(w-Xhʙ,* w'EC!D![|V(L2*gMZV~Q`S8wP4yenS[R>)dJaP u1"K'Yf-i i1Y͓A&)5n%;ݞ9u,dL~>F)JaIUUR$虯? 
é)y}Eo|;KO^DmM,g(] ^W ƞ.ze͊==+*b|I+k7P"rT.RN"T\d'R.py=Łd\6Vaj06 ʙ" Å9rY¢BHxx:C_+EIB*=786 }8 =\MeQC?ХԊr8Wצ|tV6-f^^%LH^j'!]XyɸMRWGelv͎fat|V GLVD)f I'EH) >zesɐL.cy?L|[/z][o8+F^UdԛyE/g0O )v}N,/Ν:u= ؆uŪb| 2\;=QLt`y]߄?z*#U~6{zTmt[}~|Z=]xQk'.߿y޿qo3Im>={}Қė7QHݰxeImVv 00NY- 30LpRGe%CTe!XRUG/kx%=TUȺ%QKWː+ozAmѸy3%iϘ.Sp$OoM:[ Mp¬n nO {mYƅBՁڀ\z_sp+!FFEjuM99lՆC=(aQ`r̓9 r|l O5j:mayĕ"x걀A8_s(-C'2|V ?gh!rs:Vt j*6`\bp߆ =4[<tM3s9~@!Q}6!du o_}X܈!wkSX,sPhV< s{UJągޝ909sFqaRBC#<̃Y.O HqgKs)&X6 x@N{/ya-ysF@|y6J \cnt?;_Fz ({#<6-㨖^~Ovæ*&βV#F&J./Mz$?ZZH 4%Z>l}Xikzx۽n^Jb3Mޔ{4ߴˣ rI`ci:#ut_t8^DH}(o_6Sۤwu.ƾ/$Mg \1>Rx8"s1nC!FٜQPX-OAsG5Z6 k,Z'+&Zԃ2y$- ҅8SNQJevCߐqOqAJ_pnr{%:˨cJo8[5w񋪠7ZNN:U.((*eW K΄JDkX\(*YvN yT YoXBh!0h$L:2:rqU1a( |YQ IY:x"9'P]H&C9Zyw~N/B|,]rj>ԧuJ4w%@Ws'7JbrH腭,qO;5Wu;}}%v[TtEic+/0 n\A|޼^'w)X7~ƟWHU0Q4E⿋7b4WǸ X<|]0[XfqOx]plqI6"hJ7sSV7?m7%直Aq=9.`˨0:gwKk2Wpd{t\utG|a{Ca]Ƹ\>:m]0')v>+:o L攽xс1 ^Slʿ"W}_ޢ>.PBC6Fsأ݇Uco6[j3~]ORM, zurdNX ߗDӎgL(v%m}jHENj#F%R.'&W蕬|4N y0wcL0uS6r~yi}M,?uYSKxN @c<#סR*X~~B+NA{Fpgcr<{sF l$/o_tӠixv79M%8dlDŽq&K!0;{e3;膆鼧!&ɑ́;̃Ǧz\hF1OYl)*CQ2``%- bϜ >R|ȍz.DƭˈV۾ot5D͎gL 9drԑGfW'dz[Z;.؈2[n "tƉ1R{-?vW\ 9[q l(>^gfSf,a|ct97\3`\0u19>5X@SɽV h1.x' _PB*ȆZ@i*AdDF}VT6wh#1b.9$&bYL۪L@ש[K4.R^"s%TۼK[nJR+%ׇ%(n!& ET 4c-{\]&RS I'DpęVr)82TRyCSҐD$2ДBY zJ7UXPW*@Z>ZO}_^_q'G%ů nLձb=-IVh$g%*G,3reИcRCb.t@i& jKEtY6qvW$v >$B?7V6յ\6 vDg?RnwVl&Qc'QI տE@.ޚXX5=͎gL cOL`2G DĮ̀O|\ /jԉI 4 /vcݧɋ(fwx6"X*AO'r4$ $:v lU9q~4cg`K=3Фj 0pʔsXP gH?+$hȜ>{NYz:Gq'&*,Q+$+eDbh3xo);m&}Db=_V{f߼Y/.PBj_XuJ?Ud*JmergKV%c.F$RMIxƴIlI&ɍR~xӗEO#񖓦3. &/'”̾P4.m4n;]6iRN]6P7DHuzVD;7,W}ytalNdČ. }Hf/8;&Ռ&Ācԁٰ94[x/u9)"5s4IZ4S<ٟ uݥXd]:c?$2a.5trGUFu"62q_eN%mχ[]xoւ_.-_~-\XUh/E9uNݺSԭf\x)-ga qJŜuu~`B W}~yw@n \]jt 'x(_ϼX^q ~64;I P VLGF4U>jyUϫlb"Ti[o6g+gF`Ɗ[H*B:> qtëSPS v[ ǙF_.I[.$]J?|Y}\,*>^VDPqY -;m*!5 K@"UVFC.K+C%YXXqUՕ. A&i |?3>NCmCEX]OPp!È3Dž8k.1~ rFQxGක2.H&6̧҃,cufQNwmI_!6%E O~JH U)jHH3BXŢ=տҊꒂuDQ&`mKqX\I(ZL;qN)B5H:UikފCoѠ\¶Ҧ<-.vOŧ#|%lfv !G +|2BufX0 'L]Ҵ&|ɐ[Պa? 
R-^ IxDH8G;"6a1ߦwsޢ&6뇿楧1 l>/"s ZlS\F,"aM"K€!!ȓ`b%%[d"vd͟Kavؽ/;A4 4̾ͳXVڿnt.lk:tv0L/yN) `ƎЧT!ԍb-R9 W xkh $jXoJ&%}XDfvc6`#OJj3\RZЪ m:.7. >ZŸGz WI171o8hl ^Ф&\WðbkƬ=| O{ZJ/m_q;T"Vl²16HRٚi*Oa[Jd a |V&1,'QHFSB?_=v&o{oG_<-Yp;jz^in͝0`PXOW$/b ~ݿ"^c?6cW5A<.s!r7=ݨ s*7D@ zx^w+!\ACyw|b|)]w-mlRZt[Ls¼CUas9Gͥ,|wfR/7xHN1Rb=SGg1 r:,gx򾹥DpV7;C18p:ńIQ(\pRstg@oR p?vlE9*0x6 NT ĘF{h7xB;TWLzE_lS,87^JlEBaBV'1 Q w0Yƣs:ȁ+ SmԡW|eϟy_rH"L)Qz{ɓcb)! L`z'O:d EF]UR,WHEA(k`XK\FHj:10# ̍,ȾCm &Te-֠#X+7k:H_i~aReZ Vhc6I%%H\EZq&T++:]4`/Ӈst $nD}8$J?Du;h'`bݥr^a-&4XprҔ'a{lafRB~+r"؈s%$ ~}/<1%fl̄\h7[2c:Swm `Ke.Ζur7Q'?fQ\LyW{7!iT6wH.9n5"mFb %۲HXǖ "#PTc3X.`u=r ]X%\XJ_0,W$eIy !cWJq}=d*(;`>[] 2`Hߠ])>R(2vfV|9Ar}6@. Z"cp:޾>(km?F3sBb9d۱DnVZbeQ,SKЁ0yVHAގf̈cı"8bV2VZ=T@"ǁH%iJɲ{&yF@J8MS-ywaz=@-!<1686\&@J9jX) N[UA%x@/H$Qy4 I ܂&4C+u%봪>*wx/7F,RgaiG&qa8Wp1Fd  49-J܀FQ|>N| /O|U-3' nl$H{'co&Ƒ^ 6DOu>?4sAu!6#&Qb\ftJ CCgi :#L4 htr#aǯ ' T'Z;x \K*h)SLbHp;,!d;=Qc;Ή&"NSVBgd+4 {`_$8t•$(RuG:V SЛP 7h+e8ՠ#nJc"eƺm-S pa:](esP$L$[]0306w탣TfFYݾjwS{}ur*U>޾p'd?~kU1.q[wX#r)$87BRaLa"`sW .e,X1i`!eLY! b:: 3? / w_u!\D\\2(I=1l/;L-G \VS]9Q uCZX`:V1yj#{@#ZX3:td0"LL<rTʙ2ɒ1#oXȁ8YM=bL `E-3HT T/bj ^֒S«f6}A+RV->̓;H#$:[|3Z?Ѯ"rjĚAf k/P1%qԢIddETqL!@PE% (DS "EFpǢAKe$.϶c5?5.ݱjAƠT2_6+s{uprL6dW idF܍ΩsN&29 (fl_Ϳƾr}u2_a5| -ӵt''FIYڹYk^^oo}z3S@{FGdxjPz (̊aiW-IkբV9S޺o?pEU2nAzC<.ycC1ri?jXrIZVlA"-Q 7H'W87ո3hwHM.|8[QKT[źa>TN&CSͺ J'*77e^\ޏBޏ~--AH=1구Rt`"["D"3&uRF=w6=8D读wꛧ{g`1-uWQ5|Pk;*_}GOHyEJIHJS8QS:,TԆ˲]ծrj\8SeS&>Lj@J_$Q ףp1ev!Mwlڰz4/i,-;yr`tIrl\tF.h֜t'?˽u_N'KSKy>}?Ek_&{d]&P:_a>ҏUxz>kIbG&oA}`'Aݿӏ"rz3ϚXVŊjؐ=>,[k!kknHݜ _T='هj0YH-I) DD'UDїC0maWaQ٠{+?6l2D ާ:j55N0[B]b0K11z$EJ SDs\r0Xo]D#ބSx}h=S?cu'8 yg+jARLcqZ]oZ! o!T2W:o_* T'!B T8 ?<(~* [d)* _?Q[ {IJjR^N[1]IE*=~vZ=J{F.U+_qog6 5ҭï~oԢ'jOf̿3Q1]*hR²%t_D]hFYAi[P0$+EIA\[>y+vvyJWnv)͠zaTOjxdRh_cv}qfQk)b~tӦUbE`A6(Ч '~:߶Ąݏ;'Z\of݆J)yQ1aJ< Eu!x!  
i\l45'w LD#c lx랺p̻oKÛy3<1 1-ML%}Y)M bA0Xh(˒T(FJ­c9a2\32")<:9Q~ZYaO1ΌN jfpg9C+FZ0 o)i+daθMJK6Ԛ.SQNR-n%ƨW%EC[e(SV n+x%MglXK& ="bZ$FLe{9'copȺH&%p6QĽTeƐJ$F 0D3B4U,Sp3{`nIUê}\;I]e~b߽W.g~8Nyk=7"sތ3wb ++s= 7=lꏳoߞ]V~sp2`m((11;aEp՗/դ'2t1./'=9dRy*g,:9vD*T(1ɝPRjl'_Ǔ_dNQ(7p ( $ Og1\)0(ʧNy.LC%9 Lq[فӖ^;gt\Y P4K,pe'qq,OufV-;Vu z3 Hz?5nN:T3-R_UHJW>4+k:YѭLyDx,\?ޅ[s^.C|ug]Ya|uq|V /b>EV1J2J5RD)6Z G-.S,g Y1,JŇct9ـ!Vaxl?/n0~b+ⅸ£.`j -ӂ2_ҬwlnxsL``<;eq N!!2L3SlY Wt,Ń\ Ӌţ/_T]|eۣN#5#jWO۶AtJ|`Tl1[n}K] u3 )~5A*G"@Cyh[஛fY 4>wplr[![Ag&oǿpsy8zf *g_w0Õ՘s4ꞴFI?߽wC~~I+3kR0;͉F)RT8X[;CMҫzұcj΁: eyY,h0 '$(osW"vm]ܕ*:kN-/hT-k+L[|9="TI[Rmsm# DX[Ni遵 RQZHUF|o>0xE>s m}pD25l}Yy@WM0`pGrHSڌ\ ڟja%=󊋥ywIdŽ%d!?t(25e>h>뽉dsY䞞pft{RpiZqqG޸>WqTeG9D*ATJb]%W(p`c!(b'M ;9S``1,xiAHT3DX`Sx0T`b`D!j⅋CUٚ܏f[w<!Zjit (xp'xOO /îu7fd//.r>䫹(\^bR4P*\w= 1b*ASC5A`K_j\&^8VL`#R ^" Xip ߫<>{w}cfnd~eQ3~0Wyn/wqr*@GPyo@G?]DDH8: ?xsnd:[>WWg)N]scou|?mP$9y/60&q$8Ӽ|%7a.JFl0d'²嶋 !n۩LdXAÍwB4:UJH *c26aɌۃZ)FV >-jn)H\Q￐ڂ=)*z񂗣TPD @zD8)cN0oRШnJ\nHseg:rLq1HFZrG@ C]nznۃ# :i;=Tݳc(,.[ A|?,h5QDô5a"qj(xNCmj[aLj]rj9E%[+G%WwdtĜ+;J{riBCQI̅O,\ڥ|{sf=>kI q^lB]nCXz'(ȃjιE6va֌zu@_4NhҸ{6Pwnߘ ^nݥ&šŷ@d4 /h j1awD!یpy9!`RfՔ[gwH8JFM-pfHf_]gG1P/KF<_d0/iTr7]+^]>dkp%tݨYL_)Yu\ J%( 'tMVuojnP5޼TRTج5 S{˽XݑCmP- *)Zxo~2,> B昋;;`0ϾMM-l7b s➎O|:\7gg<ٺM_>fFse6$kCVǣ˻h˕dX9oАr» 9vŠ>v;9Zv ݊ڭ y"#ScA9Nb{,): 3`ю YKVۙ׷oۢib?{q+!@(.l7(n q@Ć"uIʎ]gHJZ%& vDj͜9g<vVaMW+_jQ,v?ExTaJ .7~V[J>6y珠Rxܻ܇pLJ^ ϼBAoRQE"! i+h`U~`!\jdQIr1!!S] G_lPVB&&7?ulϋTiSTE6ŃT5_#7)$q xLSg _|\6'g/u/.&\ȇG伂 |AI"iL{I Lz`5g)6g)2º,%^ڞc_06/aԋ_Ih؇_,xSY`EUQkC3M\&pjoL!㫬ͼvdcmu mkcobFLssc)k F#`Ou*Ö! 
=p&x xOw'P؅x9ܛݗϷei?3]3g u3Bg2݆⨉PUx־NT6NjɽٿZ>,eQԾZO`]6;-ɻTqY1*1Re KO)a] 9jWM TpJi !`6FYD9,% @V")7CnbU-0e[ bcM@{oxY'J#hFSPj;*~%pvȐ5N'u;rNf#)׍y.J*=pj#mɴ8 ;|S3X+ڒu.CZ֞}vd u17{ ĀޝP:{܎ xLo7/N_h1Y01f#&X8H}yr:3, Flxuy) Hk"8(PX9gLIr {$I`bѝۇ]cTis^]r*>Ve9 {)s`t{յRQŕ"1KPfSjKLXwCoy̖Oަޫ"<68"K h\1RaIͬGx9@ZtViQ0f<~I،2jSYFx<&ݷܗ ̝f+#b6Ҙg*p ᔢ?kr!12?G޽}+b0Ox~5ODDHrgӛn'F|r|xjU.JfrSԈqĺOwS0/c$EUqn8GJj8KΨK RFv>{_x'pW )yW@T"|)B AIITS'|]pWYIX΄H*anJq.u12%D璨T{30s 7 P FyH;L~υ,aiN?%ޤdڛL{ioɴfFyXP8:ib.ZЯJdCY¤ X LqC3 B}>NB\uލF?"̄׶]]u>NE!Lo ӧ5>3 eW~Sҵj*M:/@M0U8*&&GbAaK׹lgsӝ&+-Tܚ+]{of5@Hlv_/ [v_Ke)Ҹ8{Iuap6:gxzRP# S-lD*D %B`_PHMG:{S*f"蘠[>7 8ZB<`pb( =-d`aE(2  `=0>6n0+rW=d?0 N0)W4F`QG`LL BP\`U"zJ5zw J`_aWNa vPZ21[-`DJi$Σ*Gq[w`qRȴ?mt1bI΋TIb¶<ZeQeHցF tx+o.ašt3;Z42 x%,R*`b"1nu,R ,[L`` a*ueTA| jGJH*ANPN@˒ (пtKzeσFH6g!ĸ׿T[mkP…^Ln#@]W}+tkj9m ] @z]V-&M7hޟ#^}w/w`(:1.:wVKg6=?7wgTSȶZ:7,ݬEp R56|LG'N=xzM3E\ZFr-Y _;i ?y#[F s-wW q.So#fp>OZkb9(0V&.H@4Vw`Jdڊ 8l]}f4|ʪ{>jjcI!#uL ke2*)ˆZN /QZSgQ;hW(,_B/\sbM"ndV wP&Bhr)õ,u^`E \;IY b׬9K$Y#tnKxC~!**ΩQT)) +\0& )mTTDmYJIJc1?g]jx7ﭫT*Nh}DBښlO?Yw&N #Eȧ1`KR*5q Ur)lI&,!ټMΆMR!.\=yr_(1yt''ZpLWLւ)Yz_Bve$wor > 5_z~*b P*@|Odp?iMȃZ~ w;lr_1RIA]Ⳝ/y6(g+x!c­x ^+*,j hh= Lh`; *p><~US,JK- nmi ی*2FY:*3" LE! 8 fVm ?i Y 0JT*N'QÎZ gD0Ոbc=(# K*#(VBKQ!knw"%A^/(g :_MBr'&ۭjb ?TX")->0eh kXU)Lo…*R2liE<+Af*sPcܗfN$Ljm8z=MgjbӓY/AS/gt"ɾ;dywZ] 9ۢop8ivnW3|x{{d㑽 }bRJon>P梋;J cT_S qv$@s0J1<%pV@,ALn"Q 5DDa 6Y[6SP4A49dRX`r쓹gFx(5~\b ABBY8}Rϵa`ة,=vyUXv-BHE:jlY~iLPJ앸lTf$RBe^{1gTpR~Ua!}3c\N$!pG N9xaEnD$#9YvSd Ւ@ցWYz@WX8N1^KXG5 N;I 8iprBU+RcII1ʅcđy/D9F # yAEӲPg޵h]̈́]Zr4S9Ez ZiAV`%tGxI03-'Ǐ*,6ˉmdթ,:spddJbX #*a~'>olp]t_&e@0 kl`p{g(qm%>`"0q8?G]q69ɤrEauQoO6lB9HG/YA{ bXИw24R$V\:\;pEGyFrȘ*:4_q cltmXdz+Y|fc rMyiSTP} C#8&t](IV-k)Zw-j\UhoSED- 5W TUۼV7}E8_tO !E6%JH UlGb)ҋѯB ~*P/ tѯO ^:RCQIW k-&s{9R151%F)MJQ:|6h=%]9@9R_9bH4ڣ@wUX ڍ|F^-}*'h'^Y5'L /ܜ'u@BDq98_µ* D^! 
c<\aD7v t3KIqN4=<sp!}Tsjv7|2p59h6no|]}e-?߮RDmg BU&n>_|77WQYqiQJѶ{%e׭ M %6R3rXqW|azʾ_bn"$H(FqEoX;BUJAuN=H!( 8B9JOc(8TI>iIÐ$q!.P=~нc+L0=|!O PW6N:G\\y h%gӶЖȅ.0h.sc 0ra)&ǩgkl8-+;wt . Pe7:k/ xx|:&';ٻw+yw+,YNO(+2 BAb˺YE-hA]hFČz4Ҋ`zDCU0EƭNr1@E΃sp6 g߸ W Vlت@20_yuպw'KL%"su؂յ8(^P9'8"݉١AuυTt͞R2ɮ.?X[#H (*ΞR1CObL1'jf _:JY|KɏR楗XAH^d;OqjCP|`&!38,]-9 OVR$96XM!,䒂MU1W\{˕i`zND4f?BCDaQ>fzҨzlڿdAv@A˔6+3#[`e#vB)#/ F!]Xž ! BZTkn$rBwibwO#ER#3L vim)kMwΜ4s}tn)ưЌNQ2Kw dNͩV US@5/ rs,ڛ4r7!I>tv oosr]8Mfx>=ognWHcMyЛй՟ϣr8/TL5[~Y+ow |t+FC4|]+y[Ȕ x:PS jT bD'u:7햞2R!!qݓ))Sk1?x-" DE'DrVps>τ0}Ӯ}BTtRPk\àZ+4(O Қ cS1="2E7/9#;|jo ݔR!71EULĜ4!88A +șԒ{iZΚN%oθ.H_vbi&oakܒGt>g%D V0})j?%[(LNhF}A3s) h)|{vCr~q9{?hw|a!(7;5sou& kޏff7$wtpX!nKҘ^Ε %;ouJ"@"nG31b0v`"9l1/AkY^k&@ZK/X Z=/Ktm w?#Y,F'a M vtg җ IR#JɊ5{s)=ObbCVȃY[$fhyuQwq}1}&K&݇ca]vN/ҧ? CVSbH$̤dF%6-sJ/.+ak ÕĹJAE(v7y ~1.R*/7BƟ=zwVm0o|jC`BϡIdRلWN4fHR-.H]3=O݇|2{0~7 N}$!c+Oop/aKGdBBq1llҙ;8BfC<c\ ;{(FzI>d.H>(5)qnf:=~4 Řjڕ3|x{{d㑽 }Ũ4vS&-O>vj1=0yQ-O>ltj1=0y7CXU>8:P'_M5vsaW8i0**ݏ~8 kZFUB`c~5UvAWsն&y0Oc)Z\fFRKA5hS~4qxJAFq 82c^ -/\ռt4Q7JJquA*Cp/X.FL69Z9Pj Z yCHCFAdv jY&`HLx-V1)W RkEϐp4nC*zYvInӍde]\ό x͞"kKQXhcRhoQ(n8EKŽظuʶجS*ky0 Cć@,)#a#uaL AqV*\[1w5RQW;e1 l?`UnL\0 < ԚqU'Gjb^G0-@D ˲a5@;փ5&gBǯ$&7qQeǷB`ٯJh4pH/ޜt.%%&53*VD/qmUJ ]OjG@BjU`*5治آ R >pK?V:aT,`HB b0-*}:Jy ʹJ6Tn̾;/[cΝ#XA>L&c/ ?,%>,fǙ}ϟ!bʆZoRFxx^jwK=sp4ޙ_*={qd:v<7RKrj߮.1q[.w$f3.Q%B}u^Bz`'F\}1DGc6P%bd2Bҋ}>%3{uQd3bR dBrNԕP 6.rGQ 8c:W I3Nl>hznfz)ENa#;y#.j>ofmB'*ÐaeaFO֙AD5TD#YY{$ǜP,odrR1_yj(h%(|qeacF6BoHoK_},]\Ϻ8/͠_/1c } hD=]h OK]5G%sCK\\u 0*ĸXm_`Υ'` CHwa;(y,D2.Ms<}HÉ?W_a iI#^>D=FKۣY^oKBcJ&)f77e9=qbELbn !!(yÕ{PzR5z*kŌ n4Z"6ڨ8?Xdd : ȡ=onӾѺjA%?=9XZTg{n@`4ņ198PVr [-줈4Qh9z6s(Xν{ߢZfR-p3bV7uȒ~m<$ert, @`,1AbIVDzf.;|m^HLeRVdâZ 6rɹ1"xD :ҼTy,G`Y`9 !Xh F c& f #!DV56pU|>5^h,·hhAggJGϰu*=Kpw` ˺A F>1 (XC#ѥo"1\amSulwX3UAfU_Y"zCBSԇ?6 e%ow9L%O7Q"FT}ئ1 Sjлx/[!tdfelR?`aq'aR +1HcK}el4kXR&߇y+e8S+|i,_ŔQr-gi2\5))-2e4>]JRR@y z^Iw.dћ]1vkA}FvBwxZMY﷝ݺ\D[xЍǏ>I5Qry*kМyz=8-ßj-Vr QFDS4 ֌D}$5 9^5ќV5guhN^IEAZPi|Ad an4l$<ISQKƨ - Qjl@Dt&lB/n~+G-~D> 
c/%ZC-*6un#;5iWڟ|Y ԩ ht;}`ڿH xYg:# c:i3U-*[TL-ۈ0W4#}O^hәx)Mbu?.3kzpq~OwxKK ?@>UwNxWLlHQKL~w@拕 |!>B( /~\iܯ\GJYÊUܾ0J`E_}}5{n HO SWWs1lL٢*ຍ`r\b͟*s_̻|B?1T_>϶yoH.K,!ջikqbLa>\=F9O5B.w5TK@>QC.HZd!fR{l̲ -.@ 0~@u ]s|CC'ŲXH8eDs2zp^'58o6ԫǽz噠Z1)NzD ɾ~zDaaz9:mRl5M cY{G]+O<0>LG~Ya^FX"y7 &=>Lg_˿'VN/\Y|}/fzzc_:He^=%ڕ)j58{lcdWLt bp̻o3i_5p{Zr.K2{ϚLDT-Zk33b X*Gq}М#究B{<-Eāml8"AWԁ-{!ͤi5&*H4eGNGN8L  /oE:hA5PS&$rRH$HK›u&QѕiҌjk%c¾q3 "q RI)bgh4%N} f@3/P)̹Á@_UEmKF;bI8B*U,˺u߈CYu Suk*vi/ET`z-g@Fq5"DaWj-/&pJF e\ %*~MAnTzx=:ikR|9+D7-)'w~jcZ,_ʽ#ҴG3Mkv(1a:{;Z< } -dVwY0xƨ!|D2j1b3_MQez>bّ2;.~!xkqK"=6n An=_18}nIBK Ǧ<$ZᲺ{>Y˫g~2:Q1מ>P8v~0 qN2`OuX(P4Ȭ=SɤC>A$ +"䄣Z ,'j907Ш@oH@B 0G5e6s)q2|qpeٔ+ TJܚ;sT 5AM nf73 VoWj L~ ϙ;~O6퓵]fZvnƨJRb3Ɯj-{iL9>dZu;ܒ/R|T $Za-Ҍ]VhǤnщWaW%o ;o4;:{Zފ\b'} *I'6XNi[.(5@[ =\xfBؓQ|UtJ*YI! ,5]-[t@Vqj8z%K }#AԞޖQ|[6d$WK4U^ ɻ0?<7*\_}.sSm:,{P4jH JNcU4Oĝ*N! so<~v?XftW+8We& E]6X)=9 ]EzB %+QD:PM( W⧇gPQX;ZV/Vī՘s5HxbtUA-vmMeZzeixI^RKMrph\B؄zY"K\MyΐI?O~,':$͔vŲrU {6{5޸apq~~N_vӗT`_7iAV"PHTh-Rɻ{EJY|.~fa6/tI|.PLb.gIC4ؕ(9Y| z:XlK2P3ESX]agVN@ "{*!PVZc\Eptt8+eym"(`=PAq3 >+&Z pL8Di!, 9ANŒS$p*vX --Y%"(u^\J+% >r  )b̔^0k! KvVhQDkqE6~L lf6Q"޴{ZL2%%u$UIdR%b;#yx9|4Z*Nq j8a51v$ںE\B\Sh[?.r]-zUvb{֋ >*-:`-ݛׯguþ[-10|GA6JQ{W$e?zXc_B٧~TaJT8ǩ'˧h7 kY u)p mZ|av/j6A^i0&k"+Č[ ecᅬ:!,lZ/׹H2D *ٴ+t{_jmDlB_!iHPxg1$Ɲ( ~M >حΙ|[БJ}AU# 7Uy>~ ە`IDjxZM\{S"kV[Iڒ&ԣ?Y[ny>n]{S[1[ ҺjGW|Û^)έyo#~>}9oz&ؕ?yzUݙkr7q[f ng=1j:ˤ FъP#+AVvýtdR4*ܩ4T^/qz #gS^.?N56?zvܚQ?<}u=uC٫UY;_ۢϷus5EbSd'}vOnϴV|N Ew |SEPgq3>KhOs% xr|d#(JdS@T13DZ`)‡9A< JC*yO'cu$=lm?=[NI^ ʐXF|A H~)֖?]~3JRFdkozG GC9'sG{-` ))tOH¤|A"I22#S:dY[OI;0#CDBȼVJˀ3 ;~) .qg=Xן.M}tT?!vBښf5f9b³e[7X'+p~znV V0>{w{szIJC^xS[-P)耫@;_}OϸNWԆ3[p|6|!)[)U^qPAT(CjP]j3k_SYyƶ$g5+if{z=E;Ӵ$B;w7]u<7]) Cz(h>G, DN@gp~#d0b'IC^xS޹nbv~#dvΏAֹ=wyn)z } ! Se 2 ""FH(߿쪼C+aUm{\JR`nhKTJ!ȏ&??Bjes^-k<ส+"VJ׸ `)] |`ϫCÈ`X#"@w-`Ag@7R>tu0%[0]r΀R*}}?ߘ3 ɰzܻY%8:sǢ'{g -ϧUyKfY)֯{(sOy9͝˟6Y ժ.Os>n! PN{uLYCoKQ~z02a=#QJq-dwrX*! 
ڑ2Ϯr8~1r~8fR9 a`trř~z(ZK ^MmHy;0y @tʍ@ͷ0L)zӒu'؄T2Eo/3{(+A  >ϊ9);vzez(=(~Ae骣5w׿67tmk 0 |cBo8Q;!GeYlN ҴղR;Iլk&k-Eu0+d^i(`e0 5J-F*CC\pHL 0-^kCPB nH13(5ۧkj;цarJOxb- F3D'MC͛h0({da.ҕAJK%tܼo~Ƃ, TYʩ F΀l$.4L2VE藘8%"#*b$H-@U7BU1ZPi~}d j6Uk͹_DX [[Ƶ2ՊAN k9 4-*vAY;= !ݺƮHbvicV3̈cHbI-FR $<#a -|[{vR:?cgyI8@b[L0Acuvv{fwʈF;bD$mh԰8)Wcp~i-iy.3ki3GH76\5QXdGmԸ:B3#dJc; 욌yL/nq F0`Q"vABv$fhVcL;J|Q\T3km_ҀBvoōMBPFC;8[$d]rc4^[c qc[`U6! 9 A'3Gl {^s盉gw?Uvn?vLk{?c$`'zT "'HhII oI 7#dT^( cjK=\^+' 1ŬLPLjZ%Sb DJCY_} k:'Z fŅE⋞Ud $Ɵ*Db.\85N+?[SJ2#NkYwMϫnbkk/WNi\9ZCAӆRfA(`f!֓y ]sӪF4u۴5\P nL!ҍB#=uȺ!U[-4I}:5룅A@TW>R7 NZQW ~c1X!@>6+ ,Zt>vr&Pט^/Fi1c QfW?&1ݾ]X]{M{<'YiJE /"]!7E{EQyRte‚iU' j_b:_L"UsXx#$YzCW`Wvj$8 ߂΁݇\WXOύbV)q{pFZx׮QV/z{Fw[;o?jf7_-"Vr}nݳ;w 2h޹LןݓWncm=1r mwISjOQ|>D=~9==M̤c&g))>>GOV_. qO,ןeBռ {D =2! Y,L1MĘ!J|UW%$v3*°@m Riz ;x|72ىځ9O!⹊x;#OWe7Ι"YP#b7qݯ;0 a'09f7~P`}sf1b*%uzv:4[BJ xLtL/~kk<}̭8߾^׭}̥wvѢO"b%HWJXILV `H (Jk91 1ŤtYN덎w+Gd_T! <˝Z8ІiǜUCNd= ]UqdB]|;}aZKlvA$&D^5)8\0rEJ ޚh6&{:>'B rJ0Jrڎښӧv9La&.8[OڵIhQ _&'ʑoAPxGC [A&p!( {̹ȭNJ ӂD݂&VjDyeɫuCڍ+Tw?j')Xb;% Jhy3B\OaK(-xM ,xO2w}w t31Ǔ/h4N]םXkv:K۹:|ίa m-ս&NLhchgXPñFSgl< xÝsG4VQ3$.:]uVL%dW44 ))+ALG  N`ɣpqǚP-XquV-Fژ78H{A`1y D] A F"VPQgJJb$-*dn*J\?lM#b4ZCaZI$8O',T"d!m vhBq*=>jHaR_p܂%3,Bfޙ 5 PXYp ‚f+d/(@XlA+jt?7;B6Ւ:,)JqR;kh2W:.zM! Z/4 j& ¢ xpi3+>b\M\#D~9I9iv!+ͣ,D)2¼Bڑ.dpqw-:jSb^|\df9"-`?ޏ??'O{Qd$sͰ&cDzų8~"|?_'_zj'GO Ta|vԻ%.˟{5R*(0fti4ft)I!SB>(L4|3\DHmEpLLjz ]P.rP<^Xr4߿}۩{F/?ޘ[iѵa|[Q5cMqkN4,;OlݍR<ܿyoN*z~/rV5;ȼo%R[*ֿ}֪f:5ADN$r-?LcN -*yϨSZ:]Bx .-5ߛQԆpm$S8]n[2nh(s8m'4Mk:ᄋ"S7xUCEL$:˦d@*Ÿ RyU\1+ҝTیTJ:tҳ{2wJܼ4IgRLsl3OvƦRU^H{$є!A$~~F%~xYR2J(o:`jGH+=W=Kbֳa%6͢҈BBMt1#- oexr;svc{9k[qEE.D{4[r5\CR[c>a͔19dC3۬E̗NcG?JڨQ9+!エw7t%nMMGڵ 㟁" s48PtfuR.1pRtIWHUӎg|hZ1ٵgcpRQJ S ]v ^ -Z+.Lo *^Nܯ尃c#پ%ps{V݋x0`x[rpo;@^.Ihײ6Բ+eZjku/tSrc-& vaB[yu=3//7ίNnn5_uܧQrETbC>/zAJkA(o. 
*NZ9d+/0˨mWR_KK%0KIN-^c8n׬l møsY5^Fz')5[ $H]e>1d(ۭUR̯}3Q8<ԍvo?L iSU^=jКnCq:B`v(7%Ο&LMϹ} 6TpI35c z`@\S bwTRwަKJߌ$xFh4Ҁ-!In1NI+wIF4AK p ; v RN7zm.:ѧ3=SJmw+ I\>=37:i&L&V ddYid8BF#4YeZ2)bby]Rf]t3!`GmpJ^0llkq9p#: Y/[7sjAv bU '̄+ŐXiT"7~K !Ta."kU|FZn>S;{3) 6s[ǘ#LDPW!^jDZu{Fuқum3d˘*S3jUB+$6y ,qcb,29scrȨ@KhGjYX bJ::b-A`6q G505$2  t j-3|87 X:|\@u@*q H"DBl H,@*6:Fˬs$6O+,7z>tg#xP_ 5QjRM}O~#b|7.}twۧ<'h, <@Sn/!&G/(Nf%s5_yAi\cxtmfNg}xB@xwKL~B˒Jƣ\T}%J4^.\܁QxeGr8BK\5$N8xgBo?JՁT0 $ 9u%gcɠZ_buZ+?6HjWta1:Utҡϻ~\#g&ɀ_]ZG{#?}Xg&YL;UX{'.Ǔ/ar='i*#L7RS=Agq12cά A[a䫌FmO{D*cQk9ʒ (*`UFh>ypktX89z$1! <]+1e.R-5xBъ/M:jI/sΚk&DepUV^0au`4g8+ h9  1TwXC$:/t`4j¥FyRB 4 |ަ2yXKf9¥ye{ JPLOw,(B43{b10'$CYzohz'@$KqEQ 8 C ̡F<ȰN5K}4j?,^XNe$#xDN4wig·K!h]AA +oXn-E*N [4O&^UQ ŽG(JjO%QsY8.y2BA I:c* qph siv*2p9nu1y Lcb>>fz[絏߭{ A+yt5X˧fX\Iz8~9ٛ Zєɯ7ݏۥwm &~< QKrwvYB` 3 ]M)jKtv)A%eQ w~{^&=,%Rx>P-=hU1:u6Dҳ±lPh Nu7Q`97Ov ~e̥J smYX2(L%V B'wC%5R űv(c+ƔKh@&o4? \:)FyDZDQpБ9%ڒ5!xm&\$'F:h3XEXn M<?v{xާk6'oGPti߷>4A9Li]+ж'yuya-Lx}7nĒBᲤ v˷ߌ|6""`N/^/ \ҴAxn27bAzƧ߈#(q"$ZӉ(5F .);^tlr8>TP5$YHyR\$oMbr\Z!Oٶ:Qp:>״7w dUmsSOtP{^7"bٗ(ЩW/^ߵ8fJU271\-|!?/=w7lMx2.JlT D%cHp"@D4jSYFEݍg *6~m+*lGS$;uudf\ ƺ%$c?PmZ- (QĢFn4; 寮=q;ι)bMWoz'79'TJYQIj&I_^Tby3PjDv/n~mб1 ΏBor&Bs+̒Kiʥۄ8y ՌR!t*pݽ x"`U $S E3Pc\YBB+;BdM7ئ=i픻iX1 Bc0+RS$4e ϓ:eKR3+v:3MgMcwAS-"@Cjz2J(NHT&\0cV'DcjRTo9۷;TtӠ(S\L@ fޕTn'5ƊJsŚ1^ڨg~|OgpT 'ZL1T4)S̓83$%5clҖ\ݟ<z^s<+p )q>A#uPpvD^9;9]:omt3h8`ٚU(UT'+tEՔR>$VGw;Uk8,ć +q`t )$}-Og^(EO|;"*ebv:YCDvI Bߡ8[swH"';%ʚ2}u2:'K %dj‘a(H,?_? .7!%Si9}N &DtԕDK-j0) kDjJ&ǟp(gp2Y|nwj0cm$UA :3T0fp[c'Uc}YuEP{ᅳNHB1ii 3]G'BHѪI>iS'X JKpF!QjΟDMH]jIhqQ̧dqN- a**C,gH6C)KLQs'9O3,yƶ[4yϜi.NE i0!cGm < a&wZ.;L;'=D)(8˘M9PuaQ,  D!h߈o Q[ZzJ&$F+}bO28ލV2䱄ҟ*a7M1M0AβtV*FSeԆ8+R;nbf##v!MReyGTpKyІ}} aR5g4Dt"X Fd1*fC47R4#v 3Q iFa{:]Y 䜂2v㾡on|<$v7n Z=@#ror#%^ڮ讀zRM>Rvz9՚coheE,@B/%&;~#s;K3ڕ>RڝґB0Js%3`+-h^Q"ѰXPc,H{i ҥ4q.8ٚnAw5[lK]ɇq>d>\9pq kzWs} elZw?;xt(.Q.'( U}oh2\xoqf?ŕt0KFcNYY^OvMf5jm0EhGyMQEvV3j4J,Ϸ;h@&D+9@_} #ؘ˯%L2!Vdt`}!JE-(Vڧzx l4"B Zw~)>Y=ݻ>&3h{ ӽn2貃FpF%鱦uӌqRŬ,rp}2/xˎ& xR~/[k=9$.| ]'%6d\! 
@q6ī-4;h` ww_[],BX5fMCd4ޛl!7_[]CX.b h5^`Ѯ 80V:b%O ֱ͒,BFb2G>/z1_u7Z`6[lPW*ѿ{V Cq^{}4(xݺ)wftKn2 %z ŵw]= C QG۟x#zw7A#E'Qh` 9Y%\:Q\FF4/Yץ|Cg29DL)C[m5d{ %xv!\i 2e4:Jl]jI"Ϯ8AWS" T+5%'%=dԱmaKCFސ<]A}zmoEd&w.//Ksz98*٨vk#UkIҖӒǶȹ:;s=MflZ$ᰒ{R):*{AMh]V.w$p9JmR Re8"tj~7\?N+go\KL>1qsr^Iܮ+%J)7f\Y-!q/ɍs\>-DU:a#%hO;:mXqp5.u m+5;a3p]:mD%sYhP0SنHU]hTF: ,LEJx7S_AZN~Kd]m(KĆ"7xV2l%T ևmPVBvcTǀ9Yŀ;Tͷ`qmuxKMy 7w$]8[RO,+>%?ȃJD#vū楰,(򾓣-b5>)[8|ߙ;YٸVFgk-V'ϥV$bߩ|ɐ45$g @'(EAm{ csmaѽs>O2n݂ +π_r_*՜le XS/4΁^BNѸ V "#!tBAq2"s .*MTj%/ϑ+,wݙ'd]7? Pyvր!oɞB)4N]X$*b9E@y nGDRU'*\(wU>&xVN>9 Q􎫓>ǭJ/ŕO3ّf/~ϭYsB8~YڞzMG:%fiZQ-ΰ,AT*+0 %+L*uĨ3 *[qKMO}Y\>lfb=E9m7RP媬%ѲlS2(1Փhuq2:_[ 5TUWv3ZQǨ`lR'ISj&R:(c\m\?k&Mp2$ZNjRȥ@hbxK b¡1&;Db9gm;W;G5&fB{{ɉRÊRsoSI[$YՑH+r[FA @c2R/mVpmXҭU4Z94^nbC<4#ҩ_Z1R֖$ KedaF"Tt1XPvCڝ~XCR2ب\)J(fhKuQ.2hdՂ4W?7Ka:RK j>+X%gR0IR80 ,67 &'kIP~ZshU.m*Cأ[=>5[7 4qjDNEܻGrzMr>LGfwB?>I˴a3N(a$=E [Z-jY)8λ*g)-Z lDn*ۙ9sAa i*BHYvh׵T+ٱO[ Q=^&H-T!E))ʲ4';rM.f[%.P4Κ'Di__ɂ~3Oq e[U kKWE6 y3߽W}WO~T=pL&c!T߉R37LPR=߈a.3t+|| +>Q-~k_is(v5]BUąs0n[2JmUjɆ"JfG6e4isiJo/ n rzb7mɩ$(z̅Xj.ތ'd@ZvksíL 숲fጋKONՕ&R)υ-DI(g@'.(FO-CË>S1Pe6I',elD~Jzy)!?i"e?[bӺ֢+.2{ݭ{>~O>/?[3Sd)RisRC(c"-#'L <"+/;JM"%#Bqn-HGtÆ㨃YҜje{9@q*}1APN Q¿hL\szRhQ(j8x,ZqHX- 9U:+a?6j5f-±MPvZm+zgkR&@ ɀ3Xm#^))JF5![7,=9z\,>' }$cp"AH#zRqL(Ѓ8 ׃#8'#(%f(У|ꮋ/#Ad2K))1 oIU)]9m*g""71*=0L[;2.9SJc)AF>S 7:Fci|9l"]X:ړa0#өۏe,Tk[l$%CNaHc +]hDQ:S4dnʌ" (Z왉u Lj2'Ɛb|dѲQu\zRQ~Ls$YxƩS3%l0k7$#H8Bƌ=-O_|^dڒ;K{=#)6]7y% C4J^MpCedLwM.4-gF;ii \IlР #")X(4ÖV+Ԑbq˕V } ^ /|8']&QqJv2nA[T]<.'#d.v>Kxy*,gr"97: t.vqh[2fo8C_3#sj,q΅x8qA1g#LH;\]˷RB{jXu"8g2:]V0E/v>}@ǂ Av2 a:À=Y3Bdqg#oj4A_ )-]`z[I,7=>fw =~;\{G"^keO>:[{7/+tfb)% Ey4Y)/RrB篗+=si܌}f~l7J8ئTp~W|d67w^s>!Kf}PӀ42 `+S|ɛ3˖8~,{+5x%E`58~>ۉ<o?>.yʤF|=dqT0J+EHtiRJꤗhs>DY8lo^njGw+oowwi"Є3JVΞMi1N#4e 'З1Yayҍ-AƱxw̅&J8D2$`hsP]i_bL Cۨv!1Z:׫,n +hf̥&Bh{e4˂s]dD0cp0` ^an{,u /t5]C?Z[.[S+e^hަߥ&w.$^ƚXxk/5uѭl- L*g _TJU94 nk=+B1SBbJE7AYEWjsڌ!୾6kRgSjrS1%vXm)=@dbfឰVU "}D,L,_n Ix}` WMLo,S߷)ct! 
[I6Y" b`WJepǯ;ޱ$dȦ$!+?u0Kx'+%p@ SIJh𪐜a0Ď&VW1Ej܀MR:zk{fY:nY&yNR:Z(9pFrIJ )yQB)b`e֜2L@N\%!ϯbA  RK@YAAXk0U%鴢*ʗVqqi]&d$nk 1@% Zg*q Js5i';uNHȀ,M%T@lʳ B}GrhpGr@;ZNC-0XAt30K] 䰐`s2\N4en ڐ'8FED@h,2֊ǿ?ނ2]S෉ڋ=m8^^+e8ՂzQXLDw\>w~EUH c٠` a%&{9y !;OJZ2'A˪JdaEOT'$ΗkC>#_Er҅dW$ =9<`BIr@ i=ȳSKXN@^FUP*.hySVLeByZaO 0 nI)E_d*S*!gxr*׼db9YƌL~k%}^knwjeqV X|N ~Κ2TS y&4PER^TbLU\+K)`D tO^Aݹ "ug"+ֶJ6lmrCSJm@_L)ENa쑸g5':t{?sOHF2ɮpddPhrkt0d.c䠳el쏻IZGVi0E?-YD \Z+aB hm'a&q.^ S1U I > #eA.>h"Fwֵ&|wڌ4-R JYѻYI۫7K7) 6&ylcyӌ{mBH$}AV0=75qh 3 ֻm{S ,gDFWkNSkR6bBޯ֌HmH>N*c+e$fhp{&eQTǵWݪvC!r@ GS,: -0֞Z]Ze wJ< 2 ͱ~\ٽ<~魙G, i3B(ۇј~< grGnr$Q!Iw_[n翝C[P^bm]Ç_nnj@qt%;_`ΥG CPk4unѭ\Y>,ߧqeZ>9 fcrTWp N/"=ZԽO*uvhAm_sA,CԬJp} é)PF߆2K5Dac* Hav{A漺CV3byX 9[᧹l{S[9[ 7ϣJ>DiOl6|{CJo bd Y} S G=V (3όYV5c笶vzlڅ-;6rMbvĢT}[&ŋ]| Y0aYm(Rq?ÑO[\Ο;&2L#bBgj{6wdJD_k?^.΅v$73c[^uxb 8vn>v֥j|)ZF3v#6c r}CoY+s{תfWJD"1aE׈1Ÿ Pk)YzE6'hᵀCt?G:9J~!?po6kAf܌hדh]FѺuE.Z\ Id^ٲJuY8q>&P(Q[1ee }kN˥-c|҇BJRjJ\Y٩^D뎼 ;p;`B5*q6hnÂZ9 EmQXv[ouxɌ/%(+=(r85 ܁PTLT2%x<뽷ޡv:LrqRXx^W]ϯ~{O NP \9fl|finɸ=-]oio#Q A͕ny I 6,_`O32vh2EFnlf%nj\mu8C+ 5顩rmØLm覵y0׻-Щb\>'P=ٛ<~'ІLv"kt8s۫˂U3!)ݺi [cqsczq ]kV)M ႒-}K_9i{M9;iAF~W9WAQ!E'"4"3~xV G1?L!F8]O   s6@jPz^Dsd'FK.h'Cݵ9r":-R{t]P:'xU @){J'm`BбYSwfs` 6ݾ@PNc(-J̓]T 8B Bv~ɴxroNDSb,˛a>j>*6h[|rw 't.Y*wﯧx.J7|/WΣJo&wX/=k z~;OhgUSrQmq;SF-Wv׈pK!UGP˪#^ũ(dtՈ2Vpbm%5IZDb8^TMVI\ce9[}1E6˻k?[=~ca+ K 7wa(:GA2^n:эlܺlpn=Ӛ~h8Fds\Xf0ZpŽL./ezoܮ4Jd 'TY|̠ǴOJ4̉&G6'nc0O˻o!u7D^9gs! #_hA&(t,}$(1|#1=Д|N|7[_5Ь|'Ju}1c*LDmn4F`5r;K$Bn1B=X &,^iVnGũ+K2*_iQkP9ۨF>}p==F'ǩ'ݶa\`퐼gm: Gta&szG#L=P(4!'Ѡ"Gɚ4A”N]J^z]h |%m ڀUڣ;vaW+k,2^\4/i\ݕ#^)Gଦ<k/=hH[%D٘XqH+WXbB{J͆3˫g~2ħs0ŭ(sx: s8#`P뽝ճ]8,~%?Y%䀠w Y 7@Z#3VYF`lYWWx) sWLvb uY[#rcAPӫCfsAV+ϑK@YK%y o(h?5īB_M`TV9C.$ӂsOV(<$_;Wwe=rH4e-G`2J}@&C=~Y}dw]̳e RY/`2 }Q 5fu/cگZJZkV AemKZĮSZ ok=殻\|mюLs1%mm8pۨM׏MQ!1`t y@tTkC{SW3݊$h;bhXmyw`!D;N'x7#$[W rL3x}FZ ݺn}X+7.6eZ:\{ 4oep\viS hzR2/ kQmgf>%oi,IO? 
(ֻbIO_>_kw}$zAEzPK>Ո)*dDר4XI+5M#YZ}͕E'u&1Iʽ q~ZM~_tO57Ե!hIՀ't}2AuN4Co<57գf&7ӏng+i-GI@l -7,_/ktP?tcu,Zȵ@SL^4g(g)P X= _R'fD r S,s4 Th0!tA-hwL\;Ƚo,H;SLƙ}\rPȅϣ^4㒆佒 <EYd8 k Gt#lhC˼2QE!4`DD֙H;tm#E8ι3#`  0I+D,$$^\v\{ %SG(f2Z?Ng/h2h,Y{JxOrռ7CgsêeĻ.;]7~+8/gxz]ɧ_ߞ]'Ϊ?'P?ϒܣ5]\'SOξ#L+[ϾLkoʅf}Àq]Wh@MfHq:~T \mx fs/~Z-hr.Z_׺Lٷo8E}:C,i=)B+ڇON:,"7xNՇŜuB'|H>E4s7U⇓+O*'^Din~mmZ+teD騊IBZˏ*;:~H/敤ipv Pt|o ˭W|n fB_\S ?Wl܎ƛO 2f`BrN<q|1]%b&lubYzOSz.~opgLr)qz2=愣 "5{~pκ8wc-Fq KoTGWÃ{!אS4Kù6Tù*ȁOK4Ж a`Ʉɮ&0m °AG2f0p,3DRHA>hS~R <YF"&jVx\XI hpg$h$s F{9W V O1f_xk$N"Mea#0VDZmɖnAل Yj r%>xC+<*KQ:,dɣR2IZ[Q@Zٰz3'.ߓ?9֝G~p)U}i"I ָ6|'u 95ʪy.*B֙xPCӚC8 ZހŇ~ h^JڈM-r[׼\  \r}L4g4(WPrXv5 N@ 2{PGxU˺,lx*оٌ[@<θ#}ǹPXITcczE)@ Ly\WjzcqDDT+hbυg򿯑vFҊ&r]6rldM Oc5$94?q]ܢ!(DgـT1iSHfY ./  LRO[ݵ(zS8ln 4AIZ/9[L>BFH+)'B>p9"{{vDq0gۤ`8!vN, `~z>WuzbnR(D+fiκxB*D4(R71m~o#,Oղ4Vb<%4;y P0L}KUBM"5GkJ/=WxJr\*fT*K dlEHK㝉~$ւp/06g^̼a5:X_*(7Td|^ujJ#l-hl?\vC:g[HG&4Jqĺ] yO9օ,.FՅmJGSP%MTh ەK{򵕺1iRÂFhq(p{ר`ح4Mh.M#+1r F[FSSdHb8D+yp&30}T4mi4iPs7?;mi@Z`J~%G:R (,x˭*^i6 "<(hr{D t,MFQpFHMeq l X%hg+*/Bd4uQ Q)Lv>ݰV {Hh%9 4cY3!Th"+Z_. 
Uk6oAt/8{@tO6t ={ Yuto@yu YUש*{AykaT0A(#S%maT8AT*S~мTN ҃ s{A5O_TUsru( H;Xh*i-:냏qRyKtb^ mo^@xnTY]ܭ?zw_>zϋ[zByy{&Oy }ϞJp|;ߟ.WxK_zqU5LaJ{N>||q pWos՟g.|=J]_I5% ZHZxw\ \sr<=oo㚛 ,; MPf4uF߀ҩ-덦[ lNY;h\+Bi(ZiaZ hbjZ(kBmcSw=5p :XZ k+yW*A0$A~Qz..SXޞ~)7) ,~N+I(}_]E0C-kg zKi9}SMfXJCy)Տ(P/Nk9=J9nyL\t˪G]꿸Mi"_XE7CXuWֳ4U*zWn/lJdڔxX=VÿMP8zI4bcp+r{)-uWfj5O#zʢ0Hw v0F}gq- 61[ H UߌBF{eцfșgL:Εٌ5ji9\hыA(@2YD6iMy gea3C!"eHGWѷV{0bo eS)mhĽlQrJu5bd-.*!4\4o)qC~Ĥ27'yu=ɹæ'oy:;tQNhVz6йA9>8+k]F`J}b%G&lQK]҂.bl%=b0TMXF dj/D)XɐT%$8舕>$޵57r+S\'4[b;RI|qma0D>8""ʡnLkXA{ƭrZ#V\pd #TP+ Z5ze,+(, Ze9]> $2!T"%Aݾm*(fayON1,iZf{mp㬓FyQkqA#tEG-|J"2g&SDpDȊU_|b٘k"'SMiBmGdE TDzʉ{l1 ~L: w~.U2 ¸VΠbːC8tyډ4;OødCƙ1 $K [\D(]}uپWί.>[L_/O@UI[a`zjk{E Kۻ?ӽuz[8mZ(!.wt#nC Z O@%ڴnNꢗPX6oc8d,JS)iǷ*D/&?%>]g0y9<[k S^;SH]uzVjt҃-B&]ACF‹뿊I_]ݻh?]n֫4tWX%/EzzQmPsc-IJ`Es¸r [})wQ!tɔk$+3Kcͦ 2[?{rZP8R)yÊnK :}zhިJv_N?aI%OMӘ=sq eLY"<1H˄PNL@ '@eBrrn%Ϛ?C%5!|FY6J LάQGI4G yd6'6'VVJz֛S8s k`zPOͮo9w}-%ZhاiXIJtƒqϫhBJc%b%k+ڠҶb^ wizOhN{(H5uPfdMİj6/5UjgjZφ՗Zbj{RM{՗Nw~ܓZxO=))ޣJF24o= y&ĦL:lȞwꔽbx=*F:]:-TvqH-X;7F6UX1'{7Q8wkA4GvujHoޭyFwkB޹6)Y7v1nZAd 1H ޣRfz=M}-?hrfڂsmS@5tp7+΀B0y\»䔡K{QQd+Io}a/a,cqGmZ|sCh-78jY4GK>XU^~% dCSJ#IY҅Qzd..Ԛuc]IB.k~ Pw>TMhT ׸b"t۷]c}&bcT%'D1ez:yA]xԎx%} @o%@sy btL-DBKrҪ9(-h2u5f B 溅}<4sDt %+2n5ʜ8)f=FJj*x!x)QG`389KQH2CO{b ڦZٶקs#F?^;R(pml`0Τc,#aDȨ)cf-~|XBxx|^<=5%F$~"u,GO}Ћ?93whv_<9~w[os?wwFy/'քϫuԂn{ov}\b֊};ˮ׷;cz}݇7o&S'cMWüU aL&v@P@ ]D)K?\-3( 3opHø<TgwDj<bJ\PB3 B+7YR;7ԃ͓;EMn0~"Um. -@rN@(au:N  g2']pYn ih )x!;/hFʖf.Ӹ?X{{Di:yeD)bSJq穑MNP((P5*mF!Pܣ׉tXӋ>^A\Nc+V]bRXD$x.~A3t1̒ 9dҕo,p ?z(Xr\E.]հ6WPĺN Dasuz"QX1B' g`=w^sMȃiLVmN5N&\ 7^-0TWhk"`:,:&2" H1u>/fxOYZoϫwҵ ~d_8yU'gFG_RRN!G⯢C>^1䘰qW拵|zG?zSo`(0! 
E!稠 x<EF\Hw[3s7_̾*-͟*02bS/W!^-9##:pHҮwnFK\ذoAێUyi+۵$، S>4'ƖR̒AnĬӫ3?@ȔdNdgg`-Z@`ʹ ,Ek[6AS-U𦀯3$]h?%j VjXp0 K5fR JFrI:JYKGVxSUASeח08² uW-9LzzMcS2F=FM QuySI($tm#q>7~ oy\ )FB[Fú^smo޳XL;+S3S1~3Eϫ6D_11Ƙto3pך% t 4@yK-~Z= ڿ{z`fS@cr,|>bTY§y}+ /VmKxK1FwB:kF J/~fߞV݈]OU~sv:U\^ jp^^IC-39RF(e8W5kc-sDq2odM6;̨rRzd6.唍~Z᠆@#IoM}Ύڛ8̴rB] CyI9psTDEGtAo%ު ]_T(9I!茐uyWaT_mٹ_b$޴eE\b45p1]%zS0o4Qj:I4SS !ppŌt7]Q$u@DKdbX|u).gPV&C`q\Tpν\Q @ z/uZ+p Ɉqy"b,j`H /M[DTUale@q!-Lw-7}<f:vZq\Kq'yn3'2,x P'q}N! I4'M<ݪc OȲUs-h}`uM-'`7ɾIUm R@Wϻ+HJLҼVX5? +A@ \ R;V7گSvjWxqP5b͠b|j<@jӪNgQ0 NQ3%&"A/2 gZ8B]nCtR6kN_f{lRAcHd Kv0I-uzzisXw^,ŠeF@ORh,rn;!@:,aQc1&8Үs$Lay)t9[΍lf1SaϠZQBSB1C-UX V a"I)=J`,H \{^ݽU="]~@wwyػ&/|\;^dPI&H£)7zxYI Zj+,K7  oz/vѐG(hO+q#Ub4+T|ݤK|P~`-Ld_F~v#|U+mC+Q[/PK [[,gO=\"~ÞR}_&3_Cl6>\vƖ>/VWW=2tkpCرMz@HU)}")?ݺ얓QWO Ś#RU bcdSb+IXxR6Vs RHJAS6j7 5 cjgV@dX͋0wJ oBDCGqחV1h9.}ƞ ߒ{3Ci҈5ϙsΔ>"Mp"?zS!{hno+lU{o}Fњ{2 scUZ xMUB\' ޥg˩rvWFLqkmֹ nqAhK*AF>oM[vQĉEPNߞ#(ӭ3#(bQusW^/%-?C4<11(XcMHlD$$R0e- Ib@HgӇuhg1qΜwK |il`-ʿc&wʺ3/B)w/t7S;~e/zB"θRLՋ'Ч&g;r`fspZ.=(9OG(2P߽[e- v"Jݚv yꐐ3 'EGOZ ԁd M36.E$Ow$׫c;vYXW\~U!2&6[>lj ޼h>8yjLz CPE.L*JpS1^]$^"6;2qMK!Wpv A/aѺpM/?XB>|:R嬄vdom_ ԟתyTRn*JTݯ CBՁ.n!5mpp7:RJc&Koꟃ7.9\9<_Ň,l Kmh^nDSS*GO>q,Pئ(Xs^_ZRNtj9j>1|d%0BupөiyK0g$\I7c 0OSw'pZ/1Z|^~fr9{K.s^^-z^ޯs/3BK.j3Dc(WfSSc9nAso[H> g;Y us: 6֑z1_8|aSPßUTEU\TEUQpp$c3W &T 8[iH#c 1Ģ%I.}>dH_?0QVvh tClcYh'7ȇZ}AA*ڬtZhnj)˽y!4vUX͹AWjc$AeiGe%D)X% 6"eT eSD§> K|+4 QG!$raH&8$pdYł$QVU #?0`i^VbX]J<2LgcU)#A"iI+#bXbLD@`Gi */Bci&IlO426'1X$*eHkKUr`FJ0Uuv[OJM+ 8F 4=ĵ#p2: nX@*-X!:ݝ+5I-fP:%Wn.ʄ𷷫iFlTVߒۛl73}`!ADS K=`]e3'Oߙ.y4 7J^/(v!K0LzqPXqB(Wh0.]XɿYͯ y=||xK$v<Ʈ+$fV_!T/ӧҌ=i j>:pF8a -LJ8Տ NR!`bCSƚF hQA?àGHG ݯ4afu6,ϩs=̠(F{D U*8|7F1 nH+*K U>j;;;ܻ"siUs]H"C0&EH8FD(KY$\(r)5FF3XyUNV:|#Ġ3? 
d͉'#`[(mUB\A"T$3lc8'p T[TqlT`rE!hI 54ȮƯ,?H ȤZrkͯ2{) +&?r/SjxS*0H{eHJKcR"2޲AXr' khL#kΠBǏO'gJPw<#lI݊7o ]3iݙGXvܗh(_ q$$N 1)GJR4D(e1$Fi$5۽lTBj!)7PVFE}܍ ő-cD4f86pۤRKۺ ,<󞼷vMu݊tM]9;|r.󒒌,+wn>b;hK9sK F)cQI7(auIDG3QPgqdL+LBIaaZD1ØE< jDbMTh^!u[VO6L.{<ãMr[z_|fQn=dCd\ ލ>O6ϔ2v%?xWP59ǭl9sLZhLbIvj>~̎*V+m]~@vkXצZ겳qI85.sy?'ӧLˤ` ~^A~7o!"s䔻rgC|x g)3Ff޸7Nl,6ifֆ7r@hV灚$E4H)\^h@ D'pJڭ;Q[n/j3bw2{^}޺SFe<ȮZ]0q=12dl~Cht d3qce.e@8@gM~Ƴp@pLs/)/Tц`l7}ɵ`=޴~M{Ŧ-,@qOi(L \L I3`Jw\_J tR+̓Jv!"~:">#(=cʄ*JϲV&eHE(sFWssWRWY8dzv_8=X EċײUϖX_][s9v+,nv)~Q<<$qg2ɔ @-HqRHnhM\.[|(<UdrbAyxv6tgtqyq9\}f].rbGz`ۙS/ALrvr=Q_l\na狿eXѾՠ-('wuUiɥڨ~NҙQ3l#u(!-b8\w%8ڽȧ7qf2q&Ӌ>;!/G1[j1ZcF p A71w8KIh>hڻ"+dhcPaGuǓmP}kK @K],&Q 2nlhSL8l=׸3p(6_av)軬ҟ-}"tA6pi$ Ƽ[Ʒ+޲D~A*;5`!q[F ?>ޮ2A硸[;X 8}s7}ir "H)D՚=nU$KUıf֛f[(H%(8tSA)V0j \yF}x@ڭ DHU"Pƍ~ r4pm`#[)/NӢd5" ,Q!%QYaP Тg}?}Q3S1 "ڹOyA]ŭ.-Ε3LK , Ɣ`ET #9 0FY.uMQ 1n!<&֬jX׾we ҷF1'Ŧ î3.`5t|L:\vu9@^ui(HzV+K,CbK;sW4b{?n}m}ҺF'~b'\;O-܂*iobUn=oCnwuv_8uinv6-?{~~p؜ ÊT1 U`bXUy8am4FAϻ;ijgxE*i YN=H a9F a}^&~ivubs".~}0%-_z7]zKO:]N]Hhehoݯ\UI'a,f SbM1)X 1\OkTVg9AUWTy!6OwquZ`(U,>8#-u; c>]Pjx(b3KHNE%-0|pG#{)oc;ޙIt*K3]P]{i -#杷 p"Sd+C2oQQ%4`p%s1O{%I);R$l{ԓ:N4b \gIHKňĻ(k1b1и(+QbĠR{F|4F@!dcHsH# :CQL73G2!\=nКE53E=s{60=׭{W`mq{&hHPkTK, @5-(鋹5*& ' f5ѥ:,J-рA!B@cJe%DlfH-$}1U:۩R tb8){r-/.*1K!,LYIO  "NQ!%>ݧZfNЖ$ȥJ<ȽD <1nmlr J u*::g@\ b8 rߑB1\xuHo4H6,;7,Pr2 L Z2ߑD"4\vN"}j,P hM!.Tлiw }ߔ!gC߇4!p6C$u {=zMIM(`?I0vᮂH\  н1 y nH;[(▤Md$R0&:/\:I ݻyVW(C(]2VK"]`,]D"GEGf$\2-fT8"wQOEC4a 5LQQș-* ` *Jh%Ӕ*|>.'WhU,nɦnWUc?mꋖ7v&hVd|>hsQ41!]g ׮;xh&H+b4n jER" 6҈2P< Sr (!Z2T(,eAjRs BP]B+JU XwemЫ @j qE}s ͎tv 4-#IE +vN v>Ac 8-TTNV)eh. Y*F Kl4B1s(i z/ ӛ[Cɞ{(ݞb, 3+5MFLKhʈ(yYH)ǤTa6č$A&\!;PKeHH5~161yӆl3ڸ4I:BZ(:铗z@$8Ö0tj_jw G@A K;RFEa!߹6)꾇q7t׭7Aq oPTX.mq'hcNeOP C)+ , 4  6$NIB z]Gw ΐ|D+0n7p`ˬD-`Mnr])?$I<1Kv=~a~+ g<}d3̌[T}r:#%crxхGJ u$\RLYCK2-1SR7*-Pe1r$=)yh}NAMl8$(5Լ(D"GĉR^? 
Va}=yI{C_v~ 'P),uk XE z紨_&ߝm6'̛gz7r9S?V{1ōZN׶y~A \|O#n )uzu`fګT0Y+6j>=Eq=9k#:tDa(OM1.TLj"FÀ2 Řd [>T H2eq徼JN-E9bRSD)f2c&&,\]+EuBRA̍+r{̝P8rkbm'wx ?lUrMF5R?%*yT /C4[,_ꨃUxyqQb&cҢMkd ^rU@o>}ͅҳu/?:5\Sëc1ONЍP@jP&t?Zgb$0T]14Hd#t9ofW܈'`i2qBǒ@`L v;yvA3-懥IS9e$nog^飆YtWgrf-h֭{bmkHh= 915 `)8@ $ދK>gS@@}ǂ4#T IJV-Z oQn@^h.޳(bɶo9–6pR k˰ V#".Zijj,P*8k [ͥ %4 E b 5Rah4ƚBJcX%gDZ%1tZdH-!xxa9V*V Њ݀c$)`VTf%:0x=pA]|>w/5Gr<Đ(#c8ty>H6,;7"A:X}&ІnDN;`q[nG[ hMc8dbc:Hn"NimnmXwnm 8Tv[^'ԤayVT'Y#Z * MOsFvs{6|xt?Wbf|X\^\@p ?wbp)8UݻQ[x<ҟgG:9r/jriN-q#yg#2w,ɫZJTo$RL"Gj7uPd<"3#uÝ/{lmcf% (b Ia3C)j-M)P&ͲmN6 yݫo|m{hDB .#umczݦrB3Rs+B}l0m D^'c3ebWcl62m耂u@t?*$p;\֞l) das 8$ )w6rϷ$2f<*nr9.u"%UU#^8mO=!!d0Al3k9б)SFR[Nt$H%#@ER,IS\qu_E3"1 V^b,&9Bh,P8LAX&D*g֫>(SN2G edfJ<<%\8&ʦ(kAX"d4ST'XgjrbrVULIfE͗ T1=zMgz>$#\u%/fO [y7aғ^h. cBA@iHJFg'";#w,X ".)DI~th(%J->KKB'YU Aΰ!//57x_r M h)ݓc_%lmn?%z>fm2mAd+cꫝo>d1B.ᭇ 08X?r&=7g rOu:[p?1ˢ]elNphVF4-:ӧzچ8y/!]vJ P,Rq.&־.UuA*iMhS*`{z| Xsl߸oD{t\pwup Fu;m2QP{ zPY7TtW C 1 ^A$?wV4v{1ߟ`hs!.^.>?^AsMfYd6wξYͫVByuk-{xW5a+t (i6n()ԥkGIW(7N u'm^R= @'6 HH KH}c=֛՟I"3H@ ՟EV(VHپz-T&d}Kn}he^7y )7vim,X>Ϳ}ӏsTSl77r$:X͊ūi繙֝qyO?U?DEdzpޛIڼ"$ig#o-dthO{xTΔoSuLe2o~!n Vp`VOzmKrtTK^<(s.$bI$+eSQLT3•wg-60v.HUL}5͉° GqQ? 8xƘ(q~(Y8Q4 G@pP'K~.C4GՏF@'N<$36_K4n)N]jeAi/P%Dl%IťcO/Xb7_`x`ۄ[mv=A]F7E6`| .yh'zG"|y$=%1 Rm;w¿-}9(-cIal eF|.zJ3:Yr7_ǝgXI o|6-3ϠM/o.9goi(ߦ#*<ܜʷÆ"~a%˻=(NB9!(lqm'fI =7FZ=0q`PT| !Cy9WrIQS5Fs%.JX jh`(Jj.?t3 9].>DxLY!b[p eF2AV RE!M  7\'7@x{/:H43N] iRӼWz7@2@S :}rqP!& Za0j Q˰ I8Ֆ8U>FgF"*ۺ*O JPVɂ]9ܤAa)B:?-^߄Bԍi,:8f%&ӈQ`d`O);r> $ز:s0|37EY[E`Pt#Ai z[ R"mBQ@)[Ox@cqm]J M8צu6/oUohK{E…D[}q'jI8T@pڳ:7G7~&u'J9LI,y]ăOuvAgڲdQLzm)„2ż+G$~-~tisITIg-)\'R*1RzTq"LElຈN<89*rtohR2 ]DX|n$01D'bkɆڌҮ t3:+ J/@F`fԕ A\y w[zd?2(a^eh N*ˀSʢРT]2 2T«,ԩoP F19%@3Z.d :'C伎PA$įhR!%*Ba@v-M \3<kZ^]Lꋱ+` 0kٯTcJAm5_Y'wo;8\"U7eƖTi}N?Yosr~酱y9yZ-?opbkx7Lk|}oS AD_H3E#JF$TY$$%啠 xL9h!"Jb˸(.o4 d!XbqD%@<ɱy x'yCYNTJsTr!)c*_ 3H‚6T'X3iYLbXLc$i\p* SL8q]F$$׳yfٷywKS. 
h͆ͻ;kOϛA:&YN0,t??M6\{/ɿUھxX}jG7=5,dlOj֣czq'W5n_ 7?%#J9b>G[d;ۈx5<`P},7ߵ-cOuE&s9ޢyqBi#mU܂ %|[ BC Fq7+DեB/wx,̑xO_~W`@f13O}\#dL~V;xg݂p;Agz=VĽ9!x^+n.S rb) $Mzn-ِRzZ.c^!$ 2FEpftKUJÅv ]@ OYp ƞ'atRu*Jն>{7 oh8vʖy:sÅQm< <5ӫ.ʽl T~~d?.dGX9{Cwx]tűUfL'B~oהr:KcFS̰"YJ͖4E <-&t;lqvگvz)wso\T6o/>׈_ߵFh`<ؕt]_<3a90vWupϽI !QˌGF\KpD][sDZ+,N*gznCRR:>q]L @)@b ,%*n=s򺞷>>Z^ZCN8Y[ptT_lkz>u{k'MÙ6<< ˯/ Сl?C> W賃.(e4^2` tEF McJQJ, Y(ߧ7ӽ]OvR]7NҭI )YA`210iMi#(=GsCahbM[{WhTٲK㊝嬬\)7gMWPp3-X>-)f7wH|^"h_Zc˫0{h3fXJ^<<ޤ[g)UhFKv˔_nAm0ɋ殮BenCdڇ,Mxi^Qz# MMJM&A Š}Gv8 RyݺD[ؔΖ@LmEwN5IB9mZ q5$tmL҃i`lpm,!>ViDoz|6P }gAtR0 hS"㍗k9(렄Grrܐw gP( Z@Tby;[TIr%ݞLo/.ūiSn[7|O:=_{͍s9xsPgm::jߡc(#`8Fc3>ІM⎢d4ws#V{J,Gs,MF,RSA\@}dZ "(EcN R$qC"n<5VnшY} YksLL+ȝ2x@έ$w57܎a犻k tԡtp)Z{%45ZfMC5&ndB;,9A!BSxXAj76$} NA-\Q/@JB:W VZ[G:RQ4Z1F#5Y-<:.l'+%rYr5%rt٨rxrz`JjUmw@m^hAȑYn$$!D E`tȹǢ^`oaLV<:/ԕQI h2 ¼vcE'O&l:et9b/h=AJ]ZHG8MrW#1bO-Bbg;·z*vCRtdd툔cF^yn3$vm )0rIEx $>fg/%=J/dwbRd2<\ޣT w n{0&eZw#^,j!gwVT>QBvƹ@^,8@7*X)u 䖇j.aF;UfXX&3m#`6pZ&-Z,މ.Dm$FK-KY7&[# 5X(a];=-:O^=e} td?)߭M?]|pҾȤg^MwU75}8R"Ňa9S=F<H37].؞DMUX&TAub4t`+Խro~SLW*Ne]9![9q8H|f !q6PFRn=S5Z{4h1$iM:ORXe' e~=\5Hۻ^^:#HaJfovK.a_ENuٕlv9^}nj> Ȯ}Ⱦ xS6!oִ[&7om]Jd;Mteto"fL~kr[c{C&~9%ˮrkiOiz?c̈́~eSBW\=02Lr88hHLQ塟˃R.T[ׯiSj'_i2:zMZb}uZ <_b0٪oݾҎU%IU/4}%JH'cik{EhEk͟x^%v"L*z GWej| }Qk=\3ѩɎf ;~B7rpMgwB^ wϻ򿒆-ɢ> j<\tȧ7_[L7>P5 {J5*Z9BȠɣ,5YE-6U>y%iv7/{~f^C5kzHG]d#bbI1Z\HA!YJlQ7AUPy>Fb{ȈG<;N#p\wU[{Oz0#Rtw/tpRuL*k)jsV>.i60\]ϱ_|sc(1[*BzZZS|‚t" ul/vkxyԫoo^Le¾-p !7ȇDw3<5b#&~={ڗC&h~7Oӥ?ܦTb͛%>_㳿]_˳OwWn՟ΖUMcmt,S[wvwb<{T:(2(5} >qt=3V2mlHSBefזl+}Pk &.6} ;7o)c+F[>alVzCITsRT^M!Օ䒆yv #4{/>w\h]]_!mvtx mvֺtl>|PMޭm .;6L\f5SV5S>GhY }ֈ_Ç@̯iCb[Ӄ 9^S'mBP^-ո&-$1v9?+ҁG)aMr r}Q./ʅE»ʿQGZ &1zLkːg0>Y _eBr E۾rۗnK*"##sr&i_iwQ, :AE N*kf/\zAyvZVOkjD=nw_kF)@K̴]*˴U˺cZǺ]:qU5 ً?Xk4>n0-A=Dy]tc+Loaz/^({Q(COB&ƲJ,Q R.CH`}Eھ`6 /oUShXm胜yp {A ㄐDŔ1 2*CVJۈb7WM3nDZKhm0T%em[:MAjh#ZgQѶmQ( h'7LE(X*KȊPi{,B~'#VxjxVd.!:bT4lBX@g RB tDJ<) 0:vBJi ք6g+,YͬVqudJl ˍr''` )A R`Ŗ@,q[Rn{(BXg+PϹ]p7#Yl7;H;ˇqZ1ַW~CZW/C?/ u;a>ywŝ\~,X۪A:<zIo q0ӕ?Kkledƽ6!'+.50ObNts\9Ѡ 
8JruUʓ!U4p k# tH- ֨D%Vx\~6(h:5%'N)y[~lu!$Z-uކ+p~=Yغ`|O`!FX :i'F+X^iLPJcTJ \hcXFǛΫt5a6*Y5պ4r_=}w `R.uzRE+E).ѥ.bƉo3zy]KiitKT1Ew n%$,Xzi$] A?&zGeIfGcpD i+mPASma0(tIB0}`O@<~'KW{\B;gэ";DtҺ((ïۡKQϓS8r& GkH| Fc06^PE*pRJkkVAP%J3OT^/J5ڭ%r0D7o"1|SY얯d pbd|\w?- Jpsv}{sjY(Yrc/ޟϿⲜ^= 1*;_|0H-ߤFF+.x]JR W[kt`fSBxQodAlg[N7&c ]n_^BN0E_U3ӸgCh!8h}a׋$v\ŀ^ZS 1k0޸"S`w_OhN? LNNKFB#yn.̱1_wOn^wB´TTy}U\}_Tkwj+Zn|$/~~p3GNM~ug7ͻPΈ)|8o87g#+j?2wAP(o,(i:dww /dΧ͖,",%+\g`s*X}ApbgBZAܸpP=>\0z;q,#z bv'^Y LxI'۩VhONGZ [hpɥ{ HTHFؑ+mIq+m='Ai*W=t;Wܓ!J'6Gd&De_/ɜȖfm4'nY~5Փ:{Q[ha'x<yh/Pd5D D_N0kvjrm&:oʯ^$%x^2r*ARjDE[ uY3LRW׊E?XWgf%7ې?VS6޾?x][ӿrw^?,e[PYvm̃"JK R߱n.Igfo]3[lm_^u{K?VxDˤ!?&ߩ[7@s@ʃI}Gv8 Pt-)uuˡ!? ?zu֚[*bT'}!Ot-XCC~p)) ߜnC\y݆'tm5)\!N W1Y}*\p,Zœ4b ($YھI)5فx]/SqRSmJ2W:ZkJn5!kC缲 47L}Phj!LņW-!AռGb%vV&+F [V#eg)CŘgӡ/X,F(< ӣD!ŷҘ*gvi 4ƊӕW{1YVJr+ ) ;x_>}Z+늚3p(lx7ݱ~3f?|_S&ȹ(#ï1TvC\3R;*p`fŗ?.9zkAF> ]F02a.nB!t*-FLWA+cX hļ`g@c dt$D6%ؤxa&R0_|ڱ .QRxTykthvD_2_i~Eci+ '*8mRAʙO4z AܝNlgvNDJ .ʀ$峪KkR1"$Z8ۛ@jYuٺ]J V0k/R1TpzRj_|qYW< ջ(d&LU&yBWL J'> k"^)ԫ Z+;Ͳ+SV*(hZ&u)  TZ0P3EYV ø[{\1tB^;,ME jՂe-NR S0Jkl@d70nF@VE)bBv< -<#n_8\lzP>8#+>qC5EXSV&$L@]Ys7+ >츸3zJܗ ;V}r'Ed{b&)bQ]UMCA"R&30>wnyL# hIa>~e;Z?I6&Cx:>6)PIr3<{fw%7%O_Wms6=Zyc=%T`"}gZ!.?s-|x)~2~JQ~}n*D$o|M݂E}{sX͒'sMWhRAEv{/ks!H/9hA~M$w%}0[=rVG/=!aybAa&Ix2%e˝ډ@GKGa^y^)oVĺW:g= /BK]!OQd6D+Nۘ*jV d0v9qc' ec4N]{Vlo'C0jF7'hЁcoFЮf$+Qj)#/,`}e,u/ʔL2ºB] ]֙:BTs$ф\F!#ftV nwL쬳1٥0-#A&U4LuJ載Ѱc 7xL v&*mוfNo{ٺ$Eڠ3X'g1xC <{6D=Zh){VBJ9!(;«,>8c=Sڰۡ&^!B%eTu5Qc.:Zр* F`bpz[^T+36  FR^*5'貧,6Y,pĐƹk=ZB"f-ri #/2$: %U v{ʨ-yY720LIdH, 22.Zk,D[R(2@6Rނ HI*`)/%8à*jjDMu (sgW]^|dwxB PR. (p'59 H7-#qdcHWSf][UL%kRnvdaaM-\ 9q&ɍ45qقG=̌i^n'@Γ?N,nڤrsTvg@m|)ZKhUpUWWu޾+]?FE"|QQfsſ{Mۺb.1@?}=>9`.L$} byRN{~hƶt_s|iY;\C V?ieeqxWҺ墣<[, r8dpw₿˳-nDݟJhٯO|=Ovx~*e\gNU/j~QYE#(ߊc[9Ջ7K uX=Y@z}7ZE/8C8D+CsYs0B,+!&EFD "a=HB2 -O3w#^ s4تqj%ji{? 
џ-]hV@4NÑ.uPvdBF^i@}Cb#0:@6n b3.tZw[zidR֌xw>z;Ib|J])FdTHf9L*Pơq!/`Vb Q 1[}E/LqRN:Etmz@&Pa5'˩+ˮVH:r2vcb7!FxkYK5 knvR%a[FxE0AbyJ jPKuV-W\ܱN' ;s3֔c^/6=ѱBfӢ2 ʏهeEgu)QoX ZܓZåg'VsiJ+7.>99J0p;7f?Wڸ7h3p~~uy^/wAknDcO37!yUPS%!*(6kX'߂496$RA)+dnjnMJl=Y]g7VmYﻥA5hm4W:3ZMl^r-S }pL~5OwbwcW1{"m{"4=?fc1$j簣vZ>sݾzx#~rvl!0;G0֧B_85~s:L2gS/< f2.f]~tQIiRN&Kɓ ))DE4Yډ/[;Lc0uCKYMsNk,yLY(& >GGCH@]WZu+R\՟ܩ~ѯѝ&r)}wOoKýb}z/o3kY#p8GGϋ50Y9PfρJZZ;>ve{d{+9_KW ~\BWXY)#uwWPNQ_"j%c|⿝qvy}wKn'y.I/5ۼ58{/3^빊_~S瓫.})|B1b94dZ5u so~l~l ŻM{yv\~; ʏWM봧**]-X#Cm^^Np|оIj-N}O{⿨,5Pfg_I\h׼'f(f;/'6 w6'M[ e2>LD2}qYZ٥5wv$Lއ\rq싅Z`4^f# hTVv,"K)$3:l) K']NϺt3Z E.[FXN<?W*uu7BݐI~O&7Q*K:x,$ %ʬkDLY &S𲤐tHuИa(%g5K)llk!6fWoR<AC+I!s#]=!?zн7a?;*aRg8tZiP>5w@ڞvuEpjqIWy*jK^<*PO{èWP;݋>o-mתfz#),KGW>r$k!֚pƮwQu)i}໤Q2=1kaL7.D4҉(|'!il2 @5Q"o9x #U[E)Rm21x#gyCWQD|Y-K""<-} j(^yki[\< -ئsWRgԶZZ+e =*I@:k* bvVDgqH1RI"=j,hImFR@0(]-)8,c"Y@4xH$,Xa&1.;"C OcT EQ;\"#F٨*AgZܸܸ)LdzMb6s'U(%G~HȖ]&@=b^R  6i=sʙTm-#R3Nv8886ɜ}ԆzN )^ -YXjeR㾒X3ЖY t_3n{({*n\[w4;p֚@!solk3Y -K;A5K/ף ǮpϩR9&W5&[m/fx ^ݞ]rDnC1ӏ,Re#TчKb,BVQ 8s.u *&ْw']sKaᓴs;9]eX AWqD7@cDht}R\jp/{ֲȸ.\(7+ֿ\?bUۧ]nj }DO4<,IVyx]~KZ\ v# ,ȩu:[?z^zڌ:]1J>aN%T[2M0U֑YK閘 hR:gi M^U-v=cs }l$jC(?(r,"%W+ǷIql]>7/Q}"V i|[hkɈ͑7]:ǥiScLvObphVfںW7d]7sYxYMT(z\:CВ/B¤5mhYMH#ƻVIcvMoll|B ^m#B[MBO}g\e>. 
Wc[Uò&kf2'5αLedY2Õ,SYރ>cRCєej?(X'˻[ObeѻDbI8%i yO &?h9pPn]:G7ϟ&ć ivz"9Ma矟q i:#G8GjFk`}?qEBʒ3#X ï ~A戨L eA R/9}rsI!WXzvfXnbٌ]\vbYՑD\1cPjQ",(|.VۥAeQ8&YmokTbߐi)(JF8 V-||GN$HKhѮX/)_0ʫ5L;˚6s]$1yozAו/a 孵fi[ x.7}A  Ŕ /a mMMk0Gjihzd]=]Qxq#t+hhh F8|!Ýr55`B Ү̊ :79&`7 ݢ]K9[cPrC1ӏXM#ֹå1" ( sg˹vm*ْw'\sI=DVtkW¼J ,@ ߿ړЅo6 *l(οڍ?K߇Eϟ~{qe{G`*<lqLށГ.G;p*:u'*Eo6&+M+nסJvxU^*S]B_{<_痎IĞwNJ]iЎ#4NqH۶ih8߇VZ`9_KXxt|Z"IeSnv[(mс7af(Vҷ=5*/"pϲkygq{Jub6Bqϟ_w'%[!q'J7z&{ ؑKDiAI?m=r = q;/HzšW|o [:;ΨZXrxG$ɫjԅd0Mb#c0mYRl DdkOgXlbWHE~ڍxN⣓TEAүIbs.XI|R$?jߣ& 4&ϴ$q*}3>"KRN\=OޗT}Ι;g|X\铆M?$=1^*˜8,q{FόG1}l}ϗp9XxPI 0'=9g&ͱ| # o?_n%W!X^XN{a]ׂڴi[m } 4 6-$3)%ʧgM[ЦYV᮵C/McOԵؖ[bKe·gHX2.H璸 >Z5!%()&s0t:Cj%jk}Dd:6uCtt䏭6QSb|L1qԖHM,\2m`WYZ}>01DTuk-M@hk;oP [59@0}ۥ¨ kqhV!5ቶ +ζMU6DǡOriiA|HeސX1EP `ed9T8E[O}u U>~,{LL&]޴\1P3ٖ͠zU9vuo\xg‹|\A+!t#">.+E.%D8!BG^_an2Cp9׮KP]'h ܒU~J;0E]%^mרZ`a_jVR0t[/{֤tU*Rf6P13B$?U_(x|W֨?H~QVe#4kvlꃍ+byk cH{؞&j'^N^P)&fri؇ϿN?IOC%Iwڇݲ$?}~|_eܿ?7<{}<6]1&K*STf#bHV̋C߷)rޥB3GJ6K&EN 6:%C iZUhā}ZΡKD9(oyȧ'dߒђOO؊z;="8BwOIKI.%//K/[0ϼ D]1>^tC 2ĽjYӃd$8o[::R@Armމ m)H*$!^=5(>9Em {&#zG4z &Τ\-^:;p\ɞ ~YYxNrn;D .,iIvIk'gѿmgP))3fKfH=d(՝/SR=[I yXXgʄu&'?gux~hLgEno]0]NW2™G9&.GR srL?Go]^/J.GRQzQj,J-ȍ8Nl=gXgH'M :N l="L*~֧O,E( TSPq&HX]תoͰ=Zk7C1eEKs/H{2kI/=SrnuN8Я)ptrGͳ7S3GQ`m됊4O$@v$\@Mt`okQ]ox6;$36Zn كx;zg! 
Qd-t$n7ƶu~]~/.5#8#s|@Q6qJaf/5H)o"}E, vfy UB-Fv=Uy&!b?8oB06eܷVcg=r-4X:̨$rFE2{R`(GضwuW5roց.)'kѾ"Rpw mrDtlSdvhߒN}dCB hǀYJM^Jx)y!fWǓ* -jWԻ"242`EE%ӭ,vdMkO.]Au ;ˎenNN9ũL gegGsLD^Iz0w^lo::rzr(FG>*[XE.EGXYQL+~6 K_[ؗ/魕X VoVgﶀ]b_J/GUhq<zߥ[>6dY9[YʍXnxs/ht1o/ovDVs/xg_GVM?nM%mpǭiHqZl:^T7t]>xE^ؼcܪʸy r7\DWCO{K)<ؔq+ !.?ŜWkѡeW{\HՏ7eEN<ߊCѰ%jOz ;ϛEXF>>(t a+mwϏS+~OY]{>\kщ_:l6tDR.ϼ E+QKNu$v3Jָ@݄Ltʊ+]>Df bKD StEPQ# tm%eC/in5y)QuL16KR~7T%} [D)3Qyv$)۫ .\˿W Râ38}Bsw=lGWs|0ܮ~BjS)SEF*hW ʨi'x ) GW7r'Cd2,{o( ^-eFe DGx@af[9vh-=yGJGXW!qL*Z4yYHۮ8"@۱Q=u:^av܍OD R'fknY-tbA1POG#CWo<ʼnoLg*8B P”4YUPA8o1((,@J$`){>XcVր\7*]) 2,=^RB2^iY!YN[J Pa]_řq OYŪ.yۧO"FGG߮/1loARϬ!^_$t[azIO9@uts0su>~ʡ4hb D#c|5ޝ>ag- ` $At,a@8G/"4 4|˄0@a:|k5w&bl,r%+.BN,(λY2w}sF<4EpRbYU%1UCQhh]^"^T V{fK?.JP}NrEPXA3Qp8TC!z 3S2@ęT(y?ޡ̘R6 Wz_=励h!36 } L8]1$L<ԞfIm+'=2aCFK-r5 y: 0tprH8u,?,nQpz^ͰѢNLrthôKt_|+S>qRC]4h ɩkD[|ysJn|)?+ַA!))ȟ.MPw-vsܡ^hm,{˿#( PMaf)" Q1r;El)٧g{pi $4<~IB^b&Z4`Wݓ@Y{1Z5`hzxè1)[{!;Y=xhŸWK=`-\UsnKsGݔMF1~Ql^"q8j^__| \W3߿(Ɵb.7TTUޡ/UaxłEQZQ`*TA;f-eBUFM5$o'ra~W&Mz?D1EԈi4Zǜ.}muyk<|D3W>AS\JuSFUb%f(*vRH%H탌RXaTAEcwدFLչn5?z$p,nɵ}" >cfw#J {^oa zG. 
7ȭ9ZL/DvhvhZfvZpE_l]t}, 4"+-{3>@i,/j=RER˴*I)EY8*΋*Xʂ +]2G۠’Qgh1Ce%D!loegQ长˿ E(3 Ryϸdy!8':/HarCdvsH5hJT}P|Ejnb?hBsh8/yFn},IdE48>VAD:O|)dEF_wlM=7HI<,yz$^( C .pk22(#z*^XxGUIK7,GHY12\!)H`E+{(%zFl :ɜSTܣj5{=Tb}+% ZvU{{=r]ciQEAa/@zg4j6* +v8uUue-!lm&_ c0ܐcg(- w&8TX/ o4w< p%Gg@X)[jskw4NS*tGP5{  JԬe(ǰou8h::{lFӘ7jIk8Fh ⃕Dl"sŐsEiS3Vp2w9jʕ}v.LJ1!Q Vf@Y}l?uَī(}(-ZJP&^TtN`akgdUz錵8>0v-R߷mORgaj5JiMׅz<_FغbK~Iܦ݂n_k5'ekL&qy'k4t_O2cy663Kے]7;hM1iYk~~'Q?LY)?P K1vV~ѝ3 Z>ZU 2I=nl{kqh ZFf~sjg [FA+7USXN1Ŋz -)pEm΋*Xʂ1+]2'X/,  b4WL=w{>D X~mTŊL"il5+N" q$2ni(Vw/ݲKD^!Qxu:J6R}n|5Cש/.Ri_A)~oZiMoVsC5G25wEhѦjY>`6+atmhOU G><2fn&=|OqB[ ,#vݧkvݛ{G0mNGwRuzGqW1Zn(ioChkS&Hz[^ΘݼZ@e<WVj'Q~MLքzMK9G$=˺17kC+aIJPK^:aEuUh.CY<58?L䉱B|:ՈsG0^XK)^߰Ӥ)T(C  4,*I+RRV2 K  l%AɢG, ˃T8p$Vt8S˒URUM0BAI hj!n銺 ' |MXTUf2>Lf'[V{uswȮZE[%)ݨ܇`h:Ɔ8/"h:Ǵ6)ڜV·=ouGƴxN6nD_ՠt$!o\DɔNl[SN1rPvO4U!!o\Dɔ⃷8 ݚ EtyPAoڭDS[E.S=c)2@7;ɚ5QErstxmQfmbqq5//C_DGH{b;eX[޲Mhz%|-+>|H)n|GhqXH1a}:44cy"S g$3 !x==XTXls]jOV4,+Dfu41OkoaJ=L\-A+LZ۫L1'Cf'z$1hlSӶmS=fLsԕ<&(yzMο<ݻ(&,DVZn5H>0 09?$95 ?P?_rK@DXhi%0p?y80u6w@(g.q5fo PR0KdXzե4dҲ Bp+>`PXېDƸ7Zp1POM4,raOc>CA+[f-R,GQN/Ҍ) 3"ALD,K%,}vT+ 1T0}R$~(!{R&5NKzmcVzckS:k1Anw<:mk59xа޴@z ސ^ R Š VZBEɶU>pP:C1֍ku1{8+FtԎ8KZlgUǰI*a穴>Ӄ .64omc' g<~=u^cϠk%7٬9J ĥE냽^cΚ[}/WM^nmnz=ش6l}B/ p!Kė/^/^߰ ;Y߰ z8J.Wca~.%)-+`fTp'FOVuO?9)%Y-}:.ߛRz+zB֩X>Q{jPJ/NIiZ'x'V?N}Mtwo+ŢZ"N;g_ -9+ej)XNRܟ9#zEp|7Bͺg0 T2o/,>@3DU҉c6=P8Ӽ2.yDDnnْK%j"RSiS´.HVCV@Y'_/&㵫;%"p.'}]h~.n3h3Q(4m/>4 $F-Fd/ ʼnh(J 7 N >D7cSi{x#ԟHֻlD &֙D+;i7ɼYb86ĤfIcYFʊ3eDD#Jsa#hb2oT(/2X$R#DZ`8$s6# Il&#* WDJdVarn03}ئ:Y엨]T ˦g$oR%Nvo߿??od^JBIiiU0qgXI`?)Zڬ1$0^H彻5$6:ǟ>Z@>QPRHP%fSyđV({i!w?onr1BZL'RdW-""RZJ IɞEQ":.DPRtIJ=Kt=1RDḩ.. 0Zd28ʖj0Ϟ87]o/$|mSm׳ϮS/ 㝜4+rl.tIO_6}V7R,0~&;_;bDs Yhk7Gv%J^y?az`/vOCd5cfdjt1N'|Ž&?H[EȾq~ٽ#}aI8+l!qv3ξ]#͌&"!VCZ8qLh ]!oHyސ,+G8=flGTOfa5Ȕ .KxTbq{TV H^CZ/I&,9ɓœ~|ɢ8ub`yT(p8!stOOb͙m/Ph@ b}^NyNGiv.u0cQR3c+]RCt]{}hOS>=M'yOnA>swNr/Y 70ޢA>7d.6s\.2tR;w^iv=? 
EVeXrINT)HhXj1eC%BI9cN[Tu6ȪAlCo>q1n/|t׮5,6϶h^zl\9jP5޽bBnZłyp'H)F ZX8uD bF)&eôT CHspH25߅S(NI |>x`zY[4C~Hh,)qT};_~xz; # pȇOܸa)"?W!8Ȱ8_~{|\sdױx1`?c%Ј#ț$Y,[l?>` yĔSJ'; R%־ ]X ĉt\](ǧ[(Ar(Ufke)$zXӖ $I ´ŸsЌa2H=R[L>=]"?w\HݺxAr+2sME'B!OeȂ۫S>$Z* UO\*ř]qexCǹ#&#'zy]Q/G`1T!ˌ`$)bI!$RMM22+SHSe~PԑGl{ځ8gPH1ű·$^}oE\/D)NQ $48 CN6 j!fY(6$'Qm}0@Ԛe/`K4qŽ} ܧ| x[ aI }*mV&όxwSI|q&-waNj1r$CX'r\*]^8*H򗐑PL:.ئ^No/ > \3㋧,pqbVչ|vh yvi?WӇĶ+ֿ3кա!\E/),yĥ^Z/ оH)ʕ.lߺۿJрoƣAԕ^Q`qRqP&N꛿}t- L=w+Z gvby ^?8bPƻ7_8}nG H߬S-~4RBfSP8͚mO'v)̝N\ev"N\5t4Pñ(JDԼ Iu. j/xtOfwm y0$K'ˡ~Dޏ\×d=q &b&}m5*0.>U/5j#A4>> aLI̵p78RS̰lީzMjSdv( vpJ33 s-=Ey\n !8'wWs9/*%6^5T{$Wz$u2)Z']AJvٹJ[h_6}8l JZ*jz]è!FK&i;[Rڮ뭎-֩ԟ6Mu%%jO{ |rdC_vO(V@j{@DLu#)J5Szui4GpT7]WbL!0 c`# \߽N^ 4D3d2K+3ZE* IFYJܣX`lJ` e&$`P$XĽ& 4#Fl%a*MFX1udۙlRL  eE k&A k絹b7l n}ۏDbQ8Vň{v{uͷTp#[|DV#ھfħv瑉ʛ4Tc<[3POG7z10P6CT*O7޲y碣//-Zp얢A ]T#P(.jL9aXv=Dn*O[o.oYXpggKM'@^Xe3-8HɁcw|OeFDRSJ@UI?%EzG~`u84i<5xXEN7zXGeNYI8ѿ݆6uaʥ͜p}ǖA-_hW%憗,䍛MQL;!n=u`e:hz42^~{ֻy7n{6eYHr}Y9fT%z烇kgi~p`MF͓כdq=ׯ6ɹ{ǖUp(eݪԮ#qnup $zEv7]> HI X,ҧ`H]jklyoJsNUan'J`8\N"ip]Ģ݇.nMtTN++=<Ϯ2uÇl:V?Y uqzu=mttpTOQNw0Fş=quDq`D>$F]\V-Г˫wDvDg}L""pՍfWjߏ]Z<}ZۂjRuuhg,6纵Rk"76S%ZV5U,T,Z-LfCYnO%>sl쾶Sc&!ܝ"t2~,VVr2'8 ~BpZk S#lJ?J;N#"AoUim#ݽ/[MY6x񟟔rev_\nm5Tb`JP{OE`Ka[ (}j $F*~E2~vyGoNv3Aa>>7nhBۏ utgE eHPع56kEYQ82IQL&\L ̰b<& <c 9QEsS)gcBBc+sB@$MrT!Y>'P130gR!xUr=5d`i=5*b{j&66~IbdlYF[j 8>" iAmP_|pI#굡ʭRg/4e=JFɷvWgdR )i^'W=oCe gNJG3l,ڠJx3NDIXLRlZx>.q5\\]tuH/_Nh\Ҧ%f6IU:>QQ@gmqi*lٷ%k*7-??{r[q]7W aU1huc2vCtHh.|cB98~QXkaVlM !,s {ol pt7xO-(&[|#%3(io0xOmlwA\M7h-qVIJت)W#@!="9}Tt/)M232&lTu9$ שkJdʸ*z^H\N0T.ֻu#tjf7,EhFnPǤHS['/: TI-`GΕB?p6r0XFW24WoBzoT g.&O]ww@.5hž'zɲL \)DJfJ(y&R4)3cJZ9jpChP2? cJ rǒX,q U_]S#8ni»sz̓ :xNwE<RȠZYK]HH Ӳ`zX!@]{듊G68yG稘O \/\:!I<' prATLʚX#BNApX6[RЅ X4}jtS 2%1K|HJNSݪKpڥ'1T %'9}[@ShJya]w =bKr:,O "0C@:$&O!?HLA ox>lͫk{HZ[!}Ǜl:8!=7*,6n6@H Fs:F x` *Z_@a<%]X@AERXaXSOC.CgDoeU5FNUg*3}n@H %}賰w^!$? 
E8$j1HHѽfBNiP`Љʧi_$qx9Ep!!%^Mgpba2_?`N!Y2p04hda`p"Ѡ7}BGDѠwB';@c1GV ƣʾ#tS E;# ewJ˹7+ OlCg/k8ĔAxz7^;L6Iow(w?nOLj)&ILyKsTd3ler2V` $0f!L&)&BL"TB$L՚yIhVD )U` e)dZHsBz"c@h '9}r*HjvTZ\z"kXh,Kӌz4&z)cuHh+53dVD%: fI>9HO6 Y&)$L3D֘0쩬RK/[S/NyCOdMHɌ: $k1Q7T2#iqfߦCP2s@^oc̪'zo80v 8ż44],Yj^:Xk2tRX[|U+|UL[~,:-)^d(y>`lwF67dcG;i,fc}jsO':JWLv_OtؙAj+4Il{o_H9E;l,h'Է{R @+|t^Rhg+E6Է{R q^w*4+cԷ{R TJ4V 脕Bqe[)JU6[Bj$*^ng*2mZ1o­ ;+b _)kUke6e@Bp٥e~,#Hjb1"96g= v)`W_ݼEk1tP@V_L-X|]=,7orNq0wau/jEuKc@wPiqsA 1L(]=af=2{.ħk z*Y|x}VgۿJItQ\%dZwn4a6ۗ ɬ|N1W_IK2}ԏ9 *>݊Q1AH럎hh+@V)pDtض<@.t$;ܱD0Ŵ·]tBl;ǀ3 ?ꩍOD>i)&LeR@.(> v3Ѕ]BFGe @t ]]Wonqqh>uDUb.TTPTF9 knVLYnr| WB t9@hm!/j1ɿfN.*cۣW] <, oe•bIB! !P"% $ý/~=*6ogimA԰S|¶N`lV}kGN-*(߶B;^9K){n[Z-Y92.b7L e(09CrGRF:ʛ(5\l I^f4ka3ΧLr23l'5ŖEv!APnRyN1r6[$?p1BTMzEbəGJQ@OmvCQBxwzY{¨v=D Sن Q*<3G-f/hDt|6XwNRq8V˕ʵP|Vpg Y9+`X~V͹~,љ,p켗%Or_9$7xBX朌s1Tl0N)y3g y⟑{U 摁UЈGn+xO!AD;F>]o.sym`ģk_+4|c&4NcfM73^Fm!M;]kav';-zr=>iN]oFW}%}v="@3o$cII})Yd-PSadbjU]GwU5]S+[n,36K(2JvH}lՔJTñSQ:4ab&vRi-U Yvv85SQKӭ4FtJ[ow\\NS@׿įŗgX`\ \0l'37 ^\#s3tB4444<} ZYk`Ib"G iO8ni1KLX{3hzK\=}yFƑ IR[i LH'7i_h~|,D(V?0Gt2mIN%> +Z\OW ]JP`X M-ȂPN9A$|Hp(Qث.%ack91V|G`4&UX` N-6 ` b,+-XU];}f">NeYPօ\7V''~LFO۵nfq: z$ivV*0TOǿiVRW^q>(/ށU5w=#/8!>'36t{ FWq?:߹5;3$yh7r08vtwAa{S7sW (ޢ%eynlA>Ļ|bЊIQBbh`$IHR&6AXP^&e6QfcwբM/#qN;Eq}44||f9Ei8wg |ܽSf.\L;eCc=lʡJd2xKw=OobK03 g`6 L~k{>]7RKћaf77>#={I 1{yb,}MT_-Q bǻ}vb)LQOJe%16$Kl3lMl"TKBb"Y2e&j"@Jaojj0뢛[[TI)")D IL:)mr|%>WT,3_Q Vpn'RJ<+njjR&;)mb'X纴P!qH)D[J)R[|.OJinFZJRF¡9MT_-Q*Ze/ǧA8)| tݺ82r,]板|[nmSg0n.]z7m4[<S8Α8;`J >}'CV/I,,GO,CŠ+a(c5ӕ}6.I5# $O1J(2+KN(rJL&$qDqܸtRɨ22h}uY'ӵ bQI\3D!4Rp76/)b}WF\!gl l'3c-avJ H'y{G-+N('p+}a@nP]j}aضqsV--새x>'w5I|p!ްg`?hh?Go 89 s/o dqrߣpb3Zv4|!ۂ7Wϻä(8xK d@~|Ķ@-x>1%k}ŀ#j.9nfST,E԰Qn@0) '^VJ￶:qM.ИS>% Ybh_FӘ!'fB,3&TؠeM2y|V%쫖 uRX1gVӾM8IcD"|dBIHfkh8`%/l]j؞ptpmr!w_[lkZ!Qv*ˆ`˩G ܷQ~"'y`K^ ԤOxix-pq ڞwkvjG38;V3arP,722å8O7Z󌖩,r57wȼ2GnbM;E^\XAv^^'ݺZ.  
4 U[m %l #M5MuW ~u:𸺁wGPP܈y1\#k<(6bQC_Q&2&O3(2RCId0&qU9AG>GGWN0nΣ뉙<#<'i ό&7i_h~ޒˑ*]X˯L/fuՁ3Y-o*uap2ʭ0ጡX0(P$RTcc8߲*%e nVIz#6H7?togr;3nfk?{?{y{nRݗϬ,•ZMSb$I4J@Ԣ8!T,f*ՙu5LgF$J&gt;ĕ1dK,T@xM^8I 6S1ǏHM6Lb >1u"Ҁ&q3H WJI @0\fۯИjq: [%O@_w8\! vM-~E>_G/nC\vADC}xكg48-D^+iN4Le e;";a1ѽ֌'? -) r{ _a,b/5FRCכ|FwX`Jƅڱ)ns:%ǷUPJf$KSAĂ'fR!!X l0g|^Q 'n|y紃OC1h=GOG"^6_o{{NTo".;/Gf|?`V24,`!"~0{CSuda6/5]}OZJLg`qeekDo?Zaxo6PɂCRV!rD40o)DG-׮**(iB/{ƍ K/dk*?,mjW$G`$V(Q0$pHbn'%3k4@XvDIU1sjxQ"]M/jPFl57k/n=WϬ6ʆq=#@YL.L:s2F KC+ѣuRN:T@z·"ƘuFzF~vy ?glTl`gNӖ7gFM_*XYAZ hGWv:H̉|$$F8]CK,9oE=:\EV-%/ S /7Mꫲx )CkĞ=BϪ//_ \ja2s>7Vm*o"+VK-c8~H@&kْmqkYѡ/'7糩KkckjTٍW_*[3w/ 5,ѩ QL' <^ZJPBl=`mr^Z 5km%*W-w^]DӞw|dQY@N6+#﹧O nlYu6-]C5p3 c@j HgA^@He5@FP*C=ypqf[h$+@n+/ r^N!+T)-7 3Lrf lTlę7 'QAs8L7 YO`8ceJAm)]Nh0pnUz!2B1J4C:KV[e9@*3ag|2 F|9f~l(.V_/"2Qj+ty9so <3HX0 c&#RJJ)(H ˄2%R3bRI@iVf2*A W`2,); }`,20Z5ɍp͙f#la[7?9*U.Rf$D() %]"]B(+ \*C+WV&Z\AK VX5z#Jvu\q|lȷ+asrVBj$+xw=2$eN}t˳3jxI[$~ iaІL4UD`垳 EyW?^5U9C'{QGW% ށHhM:6lGܙg<ֽϤXˇz?fd_39g|o`e=,/IVS6$hr50 tqI9 ʿʿƵ,\P?ܹ621!h_0#dc0qCh  m 9Qj"W`1/ \V}wOo&_G'0џ~2-)n:^vuwϨ˨;{6J+^.˟kŲHƈjW mV5=rl8v?Ӆ1w: ݠF<V@ƻK;wUGջ TBCX{U%%E!_#$H7KR@+2_M\+-S7|O0ĭѵPKM9Ԑuy{$X nͯ{pknNŏ\VzB띎![uތ1[͙-2s6{jRܾx5DcEy-XItgfQzCewnk"!ϒ]k7Wu!\_ԓxsL<-]  )ݘGj& U4Fb_-Չ}G֡Eĸ5_)hАo\Et{֍zZX BT'U[Qf⏎Z:4WђN@Bc&!f[\2*:(=?8rF o?t_}2 AK8la_$ʡ"aZRP/5JR[OλOvSR´37&fkRc($d}fyks>ɒGݘ%VդtQce%V!HBeIfN02@$SJ oU;l▽:i y[AbN=:o=3 O5=4T[Їj 뵳n(>VM8[Lϓ<-n6).$fD D8!Ȅ4OT(Y2T{ ܶc3eZt*g'xvNN2^}:pP)A NV;G[ <!tPC鐡N2 )jO ]h!ٔ4Emb>u5eZ=nMͷܫ#:EiWJ $:>2j)-FM-1L]mV:a=Û/{ j6BAkVq,>,]S׶jqz(쟎/iڽ5Su{m\xP1 nOp۶Ւ0ϙԹuDDHBjd1njk6X8}a_xwմ Az `^eB8|d Db$FϤ?#dqE.z!"FQD)2XkR6"͹P!)ҹ$(* $9YVӼAh#yPN~c[Cǣ`E[+]+ČJMg5덂WCNЁ.$ku+Y\)\4AFZ=NQ}hw_n(BVtV xCPGLF^C4``gs@=bU??V~ǏlRwiϗ[xablx3HV[%=|byQ FecۇmD!9=ftRx N'V"19FBG&KGNTQraE,E r%(4y3 ji "BjaDUYjQ/ ku;4V.9o9% X=~6zhNQiۺI#z-Չ}G֡C,(JCn֭ U4FWs'oY7oOuhb1QwTn>2~i_hАo\Ect?xz˺1؏4Չ}Ge!ޅ nuh7Q:%PZڏ!G3h SL!zgb!a&p@r' qB3vzq'^-' qB3&t'pp' qB34Nq':x2q q'7;䂔n~R|O]I)zU( sS(7Fmww.eY~kWO'nAg>^8$Q$A-s%/V?{xS$<K5u[kYo6./||r?Y&#pxJGo` 
Z,&7w~8m@3牋HNyB!H J(@577֞ο$""yiy;(sJ̿stDQ"-Rr)Hjĭb%) b-^Ez٧:芒O& GIMs"֝m275ߪqbeͿ`JvpBI1Y8.g9ԹX7cl\϶n'fmrDۖSӄ_Go]瀅qڒ3(#g?{v~^ϲ?T*Pc̡Br!!^{ 9g = k/+R$%K{Ѭߞ;?]-ۑl]W׹ ].>^L'yѺs;tjCb|Rk|N]ʷ~@6;SfXΤNLZ;$ 2t 湦XP&rI57C3_n4 /+hϿqlH"[n?/t1&m_6~bXi!VOFp?[Cuגp@"ȌHyǓF{ yO{kC]Z}xf'[<fr~ 0ȸ-#ZUn)st`ŽC`9~\8+@ku%N8">,p l$q,,l@RF{]~RkN5j^gafRsAy3 \*OsRùcĀKR@pv9EQؗ *&AZA7S!*$U VMbdmVxERL6J/tWy5gښ6_AT._TVjˎ/NYʼn+}{A\S.Ǣ@L陆cD$_DiD"UaTAphbc xc|55rzThsq ?mIU*ua-0*N42UIH 85LsP]L".=8 3bLTl"B3bd;TF2P'䵩-IO$p mރ謰jvZ0Mw}+Mu4rt`VV·2zӛWݐ0 -}p&!+^m_F( bCD_|tOku~aRA1_D8:×8Z,XRX9J>j8A0]n5Ī8In_~gaXw/[|ʗ丞,>'GB)s.I#x0J#A-ΥY0## HL1"c#z&{89n>¥H8 )5DFjRE4U(pLbX{rx#q֒RdKt 5Xq$8cDMiibANY$J"mnUv 4Uˌ*RV΂'?Bٟw*4 &)ĘHg:-礞 +^a](!=0iqFT ud@E tIM_#gv3Ϝ$T*>oX9Hzgnf8`Z<]KXY-gyo ߾^O끄ME2m hEa/=뙴xd@_j D%vC{Ν 14Y>#cߣut3r|uZ5?Z;21ׁHe2 ! 1q4cjTpI磥m Oa`6 ^=zƉF؊uF XD}>gò ^t{DCz+CA+EtCdc (}ÌɺIh8j5'6 )KS).h&t8-9&k!Xy m>ຼ}샶c䔻 ]]dAHX#nW8_RpQk`mOmbς1P.=ϟ|# y"J 劧w.ѺGtڎQǺ/.&Yd<[EL7O5ܱnFi;FvUZfHn]H ђLiOʪ(9#+ܭ[LYr%U*#<B Qq`M1P"wmME3<6rvf5 .ɸ~0p57}fᵅs٪lYhp[ `3wbq9zc-~LSbirmi S,)bIq,i> (r2ZX*Պ)JSOD1 O_Up1ٴOHOtiH͙&+X;ǘ_:54v&GJWJZve|1\6tctt'+PN[)MH5N@v[[klՆhQ ;$4ZC3uc5ҟ[tPMA0z~[*쳽Oxܿ`Jpa£h Ø_"XRF*+a-xVُc  *b?/& Y05i^Nc<ɘ+*k}36ݏ}ӽTaBFsы V[7!D!rM\kO!oacǠw֧]Z+ {758-1BC-Sr /> ! YP%%36F 셮P)`xRƉ+/F㹰Ojpc*ќ['[pbbSń ;T^y@c}b7Eg}D7 B"xAJ:"ݟ^E@>һ{GܫHS-% &08a8BۄœqK1:gT$`"{cNU8IP"[2iil |-)5¤i:Ȕ QN=Sg)I۽bZY#M8FCarsk>GzKa ;vS٣3i)c;G*F9^n?!ܤF04$\!3 !M$(=c<2Ab:*%{b΋92#r^lٻ >O0 zYSDN-mB S}]K^U҈Hy*h`OءS+ 9gւrGG3.*rjU(x%e)MӈH+RcʉJJwbfV(T  ]KAe# Q# !H 9ID3*p>,&R$(%!@K:g ZqZ62t^ S\}B -PA]S.Ֆ3۰Ps4g m0$n8aec)^95_ ?4~ʪnZ8l71࿘ܽ뇩|O1G=5K.h&t5-9ck >7?]>h;FNе{EDLu"[[u:^tc{N>xl OV3[:Ai@ɔafEq z&LMLqAxrKcY-wdt\& ɚBy]FAY}p:FUt` `Fya4,t55'xeM 4EzL6sKN0jXКm慲ԘlqS 6kB=Qkb0ː]3yh}ls -Q&j bqvwmH_˽d3KԛεL! 
2E|&$mQ%EQ6Nb4}X,ԩ-8޴`mekqcmAy”T7n:p3x]֦Zn [y($ qnz7d\:툫/u?wgpnM CtoL @P\P;l ni+!^<{g^C-vEjG#M}4VdaoLuW n#M%kW>$J.?N>Ƕ0+|( K?*?aިmHγzLn~C#6ShFGD oplmGr4CXSkWP]DTW7֊b4TQ(E F7tr Œ8 :8 aPP0jt y*uJ#EA8Lk J&$cU ʬ#88 6њV#Lp$eQg/*ic׏k*c1ayFK>k)8V@·XUz;z $YCRJt1oR̀‹΢cWe-PS׊/~?l"[)?q<`irwg;+C\eQvunW ^̓GyD$=/ Z< ?Q@a'.8+ٛ;{f-.P1vY*f&xjjzoGjEVs;v&qB)JwI=.I)x=JOcRUd=:&JwI=.ImPz2JrbiK})/-$A)'R,PC'PKqIjKQJJi6K RTjPGiq72TNq7RPGIԉ ̝) rLjP)T^ R{]6wa[qIj=JOk|i'PJJIǪGIJ7Rf:R*PJ5~?8Jҹw"[qYjӉJyC*$J9Jl'Ru2Q@n:"kh{z^ֻu\UWb1-2ڿdRa1+QU[yoUzVJ^}.^b&58*Lč4rD?Qd,~Fi}f;fy2JqܾQ$By5jB7W&Q:cfbFiyfߔ&y{@ Fk* C:94 J)6G(ʨGF:?l!ͱ]Wo_:$Dbq Iծ糩V: &f&??V| b|52~7wg1DdX̟_^2_,sOUN:Ca~{ps7J))GR4X,ϽSb~* c#$T f50uyӛ_/.G6F [jlp'w)RK,hɸ,51Qj1x.8H泛Aج2=]d.-s|ڄTK64M4KaYq 5v@z/ҏL4Xvfr06|QkmVN#q/M3C]?) YCyݟI6 W.}Fv@UȧڻH:z&!)sûѢO>wՁ t|QǻsA"y77ޭ pc84 uUϪ^ӫiX f6o_8F5˳|: GDIDlF"1,{T^!wyJe `&N>I<[RI7u|W37% 9slL`S<٨<aЦY{טy Mi6\mQCbRr ; ۾ cmfx'<73jc 63O#d.l ]FpRP?y2϶/coZ RWvGC-opo| :뢃o'! `q0`[k#9 u4C QY(mG.b |t1ț1J-άKk1.7hA!NF*m4)Vtā-EZXci D,@#qjs]ldq JO 9D&zcFB#aw)YD8H F:""aDŽSK,Чf!o{aKb||t+-bc8(.E*?5ueOd ;Q9!DN%NYX+6GQ ܸd'fܒBnϒ;SILGު8aB)}thU8ݡQqZ藊Ñ#eA Zc=Fj{*7ne96׮6a ků8'5YN)x?o[dM0ֶ(0{njZ7w-YP5rTϴrൽn k*9*ţ?U:8*&_碔j[)lޒ?pc M2e6͌" \9+3d!b@l V(м!O0ܡJ:>= *pȚo6o鿗gyn,<ǯA뎁Q:(ZSB1@ifqʞi)S%g*th4 U' z\D!&Ȟ\YcGX%Ԓ,:!?'vVz#|h`3ٿ a5ͪ#O)wyD6з8/Z7KAY+a>^H$ÒŭH!^r4)ꢚAI9}]֔<6StWޕJZ֒ Q{{ZC}O+u+VОV *l7޿Ҍ}t]V, #OvcͩFF=PJyP%P⿵vXS!X0a*?q(uz*(:ܢH38q:u5u{xM9^K^ 'Y%}ON/Rv]e nԄsDqYj MM "XL;mp(ȷZwQv-&tk>"2G JUhs孃f;3X\؛YH?2!bjl*x'[3ƫUY1aK71̼NCyC,2m!/^3ߦwF|ugnNjfM%mͻ7pnM C S :qnx7iwՁ t|Qǻs!I ԡ?7wk!/%L9U.Aܞn].wRLJD.%B\']H{H`' kFE$XU,(GK,6ɶV yts4Rt"٬l* B7!(Pa7YYoAd}Xcyp4B޲)ju]1"-ԉV땒@,ݶ~1(|;)DqBٕFT|ϸe,˰\!#E< }c)缼HDjaՎk|hy33"4[gc mq(hxi,M.M[GïVړがDklEhoP+] qk2quN%Nρ;^ʰ3Qవn4 땷\ Gy#D-sAv|ȉ>GrJ BhqI wquuU)j2 }Csey.Zz.C*сjLo9! 
A׆D y Qä p'[M֍C!* |h%[SP~*4Ia+Pq4&*P=l5M VR$W!tV'*P.%yxi TaQ+~밴T TOxp `ĥ7viv Yk4!-pW0+=n{m_⿟&ֿ9h~gɵQ(rv~[E__]> sɻI> |3nk1lF'@NZRHsg lH{A6XRHLv:NiKÄY.`ZOae[l߃UK\ky:]/FTNfn6BLfdӦGˤ'b͊ӥ1?f:AO 9DzM}zr(7cJݬ48%hp7}ox:7?&wh%JH$i:6*t:6t֧~l4GSwߙHU-G0' R1|[Hؓ;ZZ9_kXMH<.B2⟽mC"c }Yx,i)tT!ZNeR:Ȍ٭))GNvoc0Yv+/LnC CS$m#1 7ޘlBشwcS>a/},\QR5xne6 R;n>HHzXko W%.7JK?*?Wx~zwXWl,Pink+j> V^֟8YGorf`zݮɯn,0!^|T Dg:YC^ٴX(cgkxyK|~o%>naCrH>| R2N,7"#Ǎ$|OtpE "ɷ֧yȣ+.R q;m+넵ՍkkUaʅ:6[,dGr~ry {~(M*Um&K xֵH!Z˯(D/0U=@SG C$;gW2R3<() ,ѧ)<B;o[a TR+LK0YM5y/E $ʱ& J<_ H yr Q)T0>VɸFDqIn*&c8TFTHVa9bLG |&^a^,[g9Ғ#UFՓpv_rPvIl}+Z|{jBYag8:p>өfGᤋ'ϋfX'dV'~3aEϊ(H-A?SEg;A']21ׇ!U{_";w* Ř~/.[oc{Z]\jbX޲6f$,A 3[=r_!S146y`H8B8DK0e\L$0.! :n1gjV^݆@hn;e3fR@9m]D,a־-1 7ޘZYjeNw1VfV 3je= K]a qUJ))>= {u"~tVg2g2Dp&gR'6f5gs2 {H7<&a \::/πZһ3vŗ71RW|n[/DZ㱸ތ+bSU=lPmu|v9$yOvs{nO|( ?hpW}5fOu6?}p xٳg䈈#UK w3o+ޔJ As~Qd YT:P`Y^Vtw&i 6rJW0ZJY}mՊ(=i}bR4FYHsF)DeJ.*K[V[ѵSD)4YJY}e5 ui) H1b(EJC갓g4JӴ+uuiÁQz-3JO6p+^"& v;~6~tػz&츭۶j7<WVciCkgqeܠ`Gd  b7,;,OuEAQfΠ 1nDbܩ N'EG mo\#-ΧWM'!'}镓iĥ.Zp]b{<bl-6bRZyR2vU=a7V n,=.ND^I!XpYJUHMz.$J[5TaMMxx({n4fwb3^4+ bx',0ӟG'W!KƳkRa53G6|ceʪ[oZ@m 0\TKnލzدRFhfzlT1S2i5JlZ+AR3o$Sy@Yvpp@+LdC8uXyrhOY{wg0o֜*2n}JqyTMcDml-0gll9(R9zj`Ťs cFRN?A+0 rVWBt!nBxX,ӅbNCFեGag~a.jG\%/hq^-Ɠ֥?;V[S]${dTϗSmNxS=B8D0e }{de^!fV\'@h ů8w![RSz6.xVޔ݆@hoLi+`Pv61ٔO+;t4E؞4>tJlPRi(Vg2JHcX_s , bՖ=Q*uJej(: kt(=ALC)UI4/#LC)UI:&6JUb.UQ,"Psjmy]z(\JrEh#=Ă/ox_٨nx-c@՟v>a&1& mDW<Ʀ[y4SRb7p~Y✗bF[!IO, X)ttLWtTtjH{a=Z5sZo۪u͠QVUji|FrJT|;=ix#p6[|ZKr7YC#Rه;L֨t1UeRX ̪:4ԍoPTG],]7R)"oÍ 0dc+]iIԵ@A-Q-bR5&q#'W ٻHncW 83&")yI6@s>y86$V,8!{FڞȾ4=Rkb"ҍ #ΊjYү F&C#Ztp^c5Xzå!=T"@:ejJ^a[!jA#gIos,*9.dVkF%1ׯ #'`F)~c $zF)"А\=jx|ixiFVB:D38tZ;Rkl|9t~xZU 7XH[@@ctX@xR=ZT xFiD4hPĨ K&82| ,= a-:/;I-qktMP[9j-.;WK\b?0)iҲS{)pv/Oy%I4ndXk;RcX\i9)JPϺ, PYx)PR5^T&mJYxi^*k+ċbb5=015 /EXcsp.KsK933RߓZx4/kkFRӼ4JP4x6,T@԰b@>[hk TGFnb~Kn~KDFӍF5@LirL썢+q,Z,0FWa$]~*(5| `5(:FN QL+Utd']ahC! ^J`zY@1.Yyө6΁E޽*ި4gHd8(̬YR{KUmP|+W,)"41V+)وi DԊREZ/eϛlc,z}z.nb>xV yX{ +LJO_." 
\ŗÂAa~.,x(gDgϓ"9aeL>a#4 s8e@$ϭP $F3vN(M"fŴ Ǻ)j`1zMњrP|Ntm+ sVh>?n,ĘXyNI6Mmh.Z@(dJӣ[PeY4d5JɫFsȫr\MAo?YHZ880~aI &,94"'|}y| bG>l 3߳bKv"ީAZvwv\.Sl K'Að/ҴZ1BJ}<䍻h'}ƹݔsn]1HqȉnS"Pݺ_=Hn}xwmSyEϭOHd"1t|ngkW5v>~y{C&AU5(zju^4gYA&a+Bѕ8`TQ{G4J{M(z"5 88C%kS>J2KaA X AV/˒RE,K\AᗅN> ]c^H|^I9p"щ K}$yk0bبtD©8d Hxkbnxow æ'{q0Bǀи< ]eȲa&鐨$fE y#oX6ι5w8'Oϥ~KQWlQw]ZoO]0Myb>>k~際/w׷!6q+wPiô*uJ0VVcI4\6|X7k]_tn%;ȿԸ~uk;!Yr!͐tvO%A6|WG7b8GUKkʪ@m\>݋|X]Ce,1ܧ8;<8\'Z~|@(o.,?u*oxPw;bD1J n^m[²`axAUHFJ$Gˣ,U@rژX ,r}{Y?4"'Rt|m8E?FbRϭ.tqqSZw3=JtsZE~^NS͍ܘߨsc\%bV1fndd'k D魕ez#Š<" M +t1|R0] `3F-E0U$K<+)^Yg2E>e,; y\JNiLuPJ~B! Ҙ"זs;z 2"ݣЗj)\;W3&L[YsTrDd]0R*chh Mz0 `^L s6y )(W!ΕAOO߶Hs;P "6_]oڥ+EO->?E__}=Zpaݻ>X9 +jJEՃ~ꉵ,}9uYV O=w7@;iZ„5JThT*c.KEv+fBi۳03 %e.d%4fntd96wo-q}sW1ˑ]lF÷I8؞sl sgH-RSr#O:pFZmm$!JQɐDN^HkV; 0ҺbZPB:\^JTpIF(*}בKQaT3.—҂A~nUB:XWiz(;:1O9p,I6δ\_7W,'>B'F&w?[?ZT_iYjT/׶vihGj1[ޙbjuccA8ixàQ+!}""#bHe/K|eiâB}Ԏ!8t'#A ԢKit`;kc5I?ܙII:7bgxm b k\W},OM0pPV%2% Q44K%"eXDkYDJ I&^Q^)>[R%/02RW-Q(-җXh-IcE\av3 ȝBØ%|]NDQ2^ ^YbRcŃC2RR+s3\H 4^Up*uwkw|q5kʒv0ᐔyH( g4@w?NƖh8pUI[EKԢuseZѢۣ#GE h*d0+h<όuDݜ/m& a3gIL[4+.F"WZV|ͧ~8燴U-Y]=yK/Uj><-u+`?~"*Ӛ/O-O?^|nޞx?ow'Σ,nVx_%Gn@ ӷg^HƩdv}R9y?1G&a7wz^+w/?s.U l*~)7}Pﶛ]^Z 8XC6r VΞDOW&U"Ż==C#~Ph=\7'wZ^=I7028NkLƻ5܊MFx#-rBZ=+A ^ym XzSx"~ٹx"L OtB `n侫$iq l'Rs9e K͔b-qb5:t=lY7Zj߿ uhU 2}s|vcnX=Yd_OwU-66s|y)_G8_>g虿y~]Vrd|f/C&eSEqw=+Ci{ߤgU#TWgg!!zD 8Sn:@'!m)ݞ;ngDB1v#ĻPvշw/ z.rm)3xǝiuvakgwKf՚&{{*O{+MetT3x-㛋SP Cn ZF?ou[! 
WkAo׋rGg>:簚0}:iyOjG}>^ZY"=&ۧG73 npX=Qo#sy,J>%ݓsVExtQ*- Z-s&Rii(RG1J:R/}A9ƥRi(弚%NPZIm1.s((h(j'h~D^TXJK/Jj>tQʉ|oWRW甔Yw̹poOƿmaϛjV ZUo]]5|[R,]ŢRA(C4 ΎefKp&0ICh{4բ v'Ǡadkk*MH'4 zsWN9Zs-QX |u{[uz%װp#-D Jj'#P+eǞcbZ7V5eR3A쇩-{~a5y/v!0XPXcB:mj`ġ18 8Dc0L݄bu@N3Bۈїb[|$ѻuh|l -V>#Ļq6Uv0_ABn] !1힫O1;m9wA!PHDsR769ú ye}i'RAl6 (ĢOz^P!xR7b\(}^Ԃ#7~V+#yKR`7JQPy6z(EICU>Q(@CQ7&J%Pڪ/g "oX20S@ A(;l( 6J4Cb9RPԋ4ICE\4أA&lc2i?.M-_CX&^8]ž,2=UɩHќ&2+OY Rc%:W[l[/3I8RxW|W9L{o bT"gKã4aj='@VG`3!CN( waz%5X J1P=%5BㅶAdЖDaZC=Iv8O `t{%L)w*54FˌY2/4M429J & PTsZ3I{&6LJ&TFd"+>"O6b"طg U OVeC@=4̲Wk gJ  2]pPjU:SLnIe"(ҢCi:|ڤx= S2aBq@pm^{Xh,E 9 I[Rbhi29K7STA"Kctȭv )v0Z0Ҹ84.Iݳa F#Z.^4.ջ/x-j.8ʌRuٸ2Xh\QiwMÂt#[Ńq!-BrnZ<WHT>2k;tunwp+o}6{Jt"\cC3 2jnJ2m+9$V H)ޑlX].*ߴYNV Yˌz ڏALI&4qaP;M+OADI*݇-;UĤea%{B2΃=71^"˯ C? j3^,J@gGۄp!8O-p%JGΧ8h;o- }0uϹǹ:&Fp&Hg14[GZ?#_U -{VØ}pW_.wӏb~{yd~x/ibNh~CKdTyL. OdmKV}I2+@%paI}+fŏgٲ\]d}lտ/ֻz ˪עٛ"~xf>^ki\'PjQP&YU.Ԗ*e"-d& 7[ur9z앤 {ÍwqCF1Gfu_9qCџ<5Y+^Bi2&wSi)M&_e(,I)Qi-:9]Hc3ŽNb+nėD񗜿\5Dnk_KLסLOudf'" DW⸱4̉ ssPLJV"E.EߜƺLMR 4rPVD OKzi*'N'No3yqf SyJ7o䖣X -ou΁S9>ܓ"g-Z(CM6,V11J5ȴs,7-krp}z~vd%}g(+7 ɹ'cAd=£Rﶛ 5&賜[BaH_:`n=tmzi[]1*vKZRՎT;A5{( qMdi]X>"[Xi$-JwϮ?z>xy;F|u9wWFXK!(ַp;-],E5ѢzҚs_Y^,&49__ͮ<=}?&nsr$F+gi?%Xnk14ϸ"T53\=6SH`->+`zp+nJ)h Ί=ejUX6_$t~ڇxOX&RݩbibiLԍ BCmի'r}۾]Bl媮}'t12*\_LO6B֡E5EbMu =R(;w\$ o~F#Uq%kEF,rWR"RJOEc>Uf|߄VrꏒE:KFRҳ|6liDJɕ#.- Tp(NAJ /kz?igpO#,t2RfSR̚\ѻ(=iN=)(YKs*))gE ܓ>f(Z~' }QV̏) 1ֺYX- 6E҉wZ+z3;1|,o  ?MGK4*M'o +d?CIR=t)E%e? 8LXue-"/LJ|,yBycקyBOzuj)`_*??#Heql΋@guD~gI#JȮ|6ֽ/)J ΓGjhq|!_nomu&v6}7@yӵ#eʥ$ST0 V1s%J@1V(/&fCBZm\zt;(KD )׀#-(,OX+e0 dH7AE |z|x1heQN PPW> h'8D9>^ioe4SWe8楌Xnd" c\u'n? 
#]ڛE8Z0ag;]i{ˣ:PuW<3jhFoZ#)Z${"h.}@ { 2F͈(ޣ{f%YT3Xv+J|$H؄.^>O5RK$r/&A$=ݎ\fG#A'nوw8`hh w,JpEo{e48rEBoba)PgSˆ2T qL1S–Rk[HQ "^@-GR)14cTk`bt@(AC{#H#z!jHP@I'heZkĞidwdM"&%*C D5k^5cIfH Xl@8 c+%hH0䈇j+̤y$ְFDN0lm{8q#R@˞m^C72l0'nx:*_G fq$a%;fy)QF-rUc6jv%%6D%M衄mt;<2YCJcQ ?|hqzk iO֤s ۶$]Fݝvl)C, AK_' tyI7Ӕ.l !h(YX-T+!+ij=І`&$XKOcBV"jF` R; Xh c(I˝`&^;Vkq,$yڞtׂ(!"q 6 {&gCep4A8sC9 (BB$TAfHbSps\ղC$RZC#pY.E- ֞#3vY|mae:A=eHRD)%< GH z'j Q@h8WW;9 ՓbOOMđ@N2kpLYУNw}z[Ud!ŰL2 [I~iUJ~J}e!0'HƭED\ImUMm{xL ʼyY'[ ɮ,E-0B=Y/+`IҶ FbKz.+^KHxǷ58BX>ҿ7LRf) L6 MGf„?Ѳ~𺎍Fp66ӏ !A>qP}yPh *Ji=$H3KwDYo 嵠^ F1 ,z6w]nN~to76t3YgF7֕<FqG{)`,M\0Q8RiBl^\)WM %brYr7ɗF/[`zlf&7!MBc%)2YQ1$H`%-T坄[#pĀPV(dĺ ͧ`C|€c֚~t ;[ꏳ5N_%\?Hx--™rT~w5tkeT%d*̗h"BR+XP,+ Tu:Yh6aO>,_rHLi@kU)B)eL-_0\H}6XWٌ/!ʘpk1!틀 Uq,\V@:X/+R Fakx>r(ctyb!Pa)ݲx%T,VCC;sE&+1#8Cfz蕎"%@-_7Jxq0_y6!gp|s'p9C!)Qß|VZ/qv=8 hīi6O|uo懃HF5~ %013wJRӶS-JY`uY^KNcu7sJ$8􄗌:zY0J'-CKg.^o!D6FzPW&$ 2DDYlI9nzhrTV5 ,"1 a!B{gK-0_~Ӈ1=B |A;V$V轍 /Քa/$}7m5.CMI;*%fSw̝5@=x;?Lҝu[.X\^ʏ6:1 ͈[YXEaCN"ʟ|quIM#}]4}qNh{{Dq97ekE#=9q%&@H9%b'*4E"bY 2YOj,~=R'ғE01!b."%Nk5D fIHdž&%gSg e@IHi$\NnjMAl_}T郎ӫ ӝ \ Qpg*CN~Fvu"ڮ1 HxDHwH LcR#GRF1m0B4381jtWqg/ooq^KcDyX f=` 5TD LOH} 0-jq@>@"6k|ҋt=[BjMq<"N".N2|-:.k/B5(<=DyU,K"4IJ S[Es`I^ 4p B`JP9\tܝo޹E̋|}'WYEgX 'å0THޮC_2=5=f_5͂?ǦUP}9Ud︧UѠU=Vr 0[gPB% wʱRWJ>)JK$*֚aڨXQIwgxϼx%-M3Q(lĤ%%>e}߰Ml|yx\VDfƏ$So jʥPʀ9HY>.j0& \c$iql/W 6Z+Zl}5)djvRK0,}֓)&ݖv3;!s%8u% T+3<˨˜49}хQJMv ~C)Ýi'?op4N) &8Kϖ`4)ydq)rP8j겖wmU42Zo4#%?@H;2Dfl(ƫ6kC[wMl`} c:2Cի tq4!u@Y]jc_TQLj1LWb ,Hq֊g5` 7Wc)Td h9)lԖXK(ITB+'JĪIYƼrP;@;BHv8~\-:پTآs.:W^x;ǹFY-i! 
〤,mwb 8Gȅ&ñ1ʹQ6/6J=(1s+WnH=`Çw1~zpލ(-4/6޻ ڞ50oiW9DFVr;G]zؙrxʩ=e4i(*B{-r k٤%bݑs cqb(;beckwQ$8lZ\#wRcUfȁq_(~!Q$r)0FF蔬ř~5lnь@?bJMEZ(o1OE s55Ҽws׶4_hek=ufi C@iX+7~:g/<KX%'ll[^W宒$ΩJ|F?&4*ex]QOD@yjFe\]]z|[/3{۪:uEōzG".ZP#o#.JƏgGT9")4 r-)7q]?|J+wåpjTk)t ˟SX-tڂNcҶ<:ah2NkJ#DIȧYd1BF7 7<:Sd!_vH:& ;/p#"4WN$=$ZY*e~"a 3࡫@B[&Q;c {-tPrIF~jo|${7]囼 S0<1HiLd5 ys#h%urz5 [0l 굆jEc/8]Jȷm,k/ lnU\:6KL ڽ~ܞPu8@QzOX ]~Ws^K58h\DLlI$&o"&La(|pb]Kx1]s@]|Ly?{(c'm܏q4=Iun!{~^3vȹx5vGe!:G-o%ƇB)-Qj}y0A{:_A!#T ,6.m5NYhf8Wo/Z[M],>Nno|Ukz˻@zwYywM> /sK)S,LB\Xxn蟱%w'_ks&Chn!F+| 4B4ڤXC&gHSwfh ^c+)@ebFEm? 1rAȢ!a~@. 1R^w?QƻNiTrw0r/DLJ_>xo Eܰ'hRige:ڵ{fMmsI/?oc3d A0,nUDy =?.9bѯq1j g$iBGOuKg3 J-Dyę7␖ G H=?.,O^a,GOaЍ&Bg3ANV 3}OӉe2_/?/ pQ"H#F ^pHèE]Q 7nإgf&󅪱 iS.#3ΐ![XP*p7lp/ΞU{v2V?Ns|X9ou:Ǽ*"DJre%Xv°hndSvj?Z(qoU$$kÔ-3—Vk5 ѨӸX=g43>q"R3:pۗR;sviFԽI@ Rj n>d aImYiH 4*Ěm֥Ēb4Rn,GH\uжuq*\X\Jh{*\D>  )Y&\u-<`Q}ӫs܁<hm*΅98t2)P}6DN*vEBMB0u}fvn緝$&|uxYqq{m l|Ol@hЭ ?;%SvELg#V RUyD'k+^BsvDűT-_Ap}6x,:Qi tV*/)Y?]=BAHLWR3]܍*QI&AaعH2N#%Y(brѳATH^S tW𒸍X,7%r,/ 5?c7[/? u4B 8l(,#O3u?h\l%.*,+"S%+2gdȥ,dgg񣫎e1W]O2:X]}#5Do 1Hbb<;Z|@X|jj 絗UY<$le=,z9_7`H2#ydʪ q<짦`6J 1ͷ 5S )VHwf@78-VW~:drGkP"7[kڛ :gH%wKF}-V,c />ԶuϷb,d:9$E, bȈr#HsмJy$ܺ RzSu&iU*Hȗ"/dEz-j/~,T0X-o-tYq214os,Ypۈ$Zp .p~GDnn-3>r;](* sIb"Br2×X@}$OVRdApr [) Eӟpshi__vAZ!+Um:ea,y'DlE+n:bO-\`]b83%#В\R@Z>sc[im̤wrѠ1!ZCrةb;ϜOI/E?J/@tl§K ^)ʺ_wW|芐 bLg9\a^ԁV],Ny8l2KXQx6Ex 1M=%l]x=#\쯟w=q-HMR!OVgkb/@QsVJYk7$Eяwsxggj)˛'Kף.VσN=j7j烬gvgk^ F;h7yHU1Ox0Cl^5&<ۯY2w k_X5O.1ol/J,ƔE7>Z2Ug#<΋%[8[gX8gQ{O%r @K~L\Y'@biSUp,}bgyA $2Ɂy3JyLA*H¡jZ:Wp)ERRA4eQ +'[}m뫣(JB5l > Z#ʠQ'Gֳ_فʴ>2i;ZJ͖s݇92ϣaX0ɾ^ ogIШ|:jIPSV8p )8ZԳ>.9)f\R-bO\w>BϪ5Y;0s2{[,sU0FQ<a$csGnm-:[rrқ⢖4"?AȠ#wWQF6 jTGb!C`Y\1?ųixLVzA 0a_@p}3>^ TF_$(#O?) 
|5Ȝ4V(Tws=Pp=c`FH)9{h{*>H)]H^pNQt*56ZE`cPDRJ!hZphƄz-9?;dLzkOPd" "Z'[eN9Fɿ[ANw;r򢐗pT<IEbhfcB+ȵux:OnPPcz>1o;LD}i'B}*- AObz)vOXP,ɀL_Z䕣`0zHKTd؃!%myyo%m }^2IC>?ɭh8Y[=;$4{g6F~ QEU+~N۔*,=vN-+B暰u9ߪs/oi;¯.MN&Z μ/0ѢnZzr'$q=վwq2ST VJ䮣9aa<t]%`RdM)[zJ*ũo[m2ygb %.3,A9B\W!3Z 9'[oݰ vDF`eҁZL?Z= :ЂS"ct_o<]¨Isbц@h wA0#/&j3; ۓ-Aߎs+pXYtA jBgZq9i BѲ|cLo%ޝ1o;:G i%k,d\vt@e}6(PX!))LΐgcΣ  :>^L?E<pGkfT;7f%o¾ӓ[ Ҝ4EQ3AOp84Ow߇QI3w]"5,]N Si#~JD"*źgTRuxo>B|Re/KJP V/a?~b1+=Zp},e"8Ln([\?7-7d7ln\,wd)NmsOGg#znKV֮-ƫx]uAϿ3//>lpzlF=*d\PҳUF݊BPMOSݯ\ "E4 .~k_Hb.hH{@1Banw.wΟWMVq\6G^ ;GoQMa\.l|3kMRnog+@b_~jI5Aϧ4l'7CJLDes b}v,)ѭ׫o"򓫨λrs߯D-SA}1łs*jarNTw| PF3n^Zj8>:pHՑm2O_QcTQvt1$ $o~6zmd/m3pFIzFY_vge;_qvy<ZaV,6P XFːʡ*^d9Gg&䍱g-^]Y#D 9Uծo)=oMWx~۫zӛ#^Ho-;Fz K? s+FŀQy(+m#I6Ghb{^D^%qL4bD)DEHY M/"38\jԣ{2\9L)`\{)e6ʇe2`lYҹpiYY*ͤCƖ\JYBmk Hj+vZ0r]Xd5D*6L'i)B?hTH;!A$UP 6nCF˭}\iqoЛ _z9#RѼCZ4v'6c ˏ?]ueg'?0i$UڿwԗN"u~ t`7&kQ0F5_w.OEqEwtn:x`<ƨQ6ը,Xy-|8ı2JNl O:UO\RqP`IX-} QVH%[{1COO/2ӓ8M^!3YM}>Hn0Ф,p7ZeW"gЬfm(-t>RE:}9o7Nh7؟~̧ ܗDɸSĀFhExEi5.ee|TF&GűJjO\Ag8UTyvџp@{C$-HaH[0-J&7y76opH4Yd<1y:)^q $rq!s)T+®_5jn`ݏ:^c.U S}y J#&FS[{`e˺i9fm}yJHYR z%E9t4Hd\d( J/d3]3R:IXFtJЊԑukؤ _P;=\een !|Τtk(aJ32(e"x 6K1b,сEyLpuSFg ~ȀF0:8 ˁaVZB`vu a8Ū@ВP9R/>L3drNQ{^Q*8(/lQE &9 NJnhrq'(oT!X9RqŽ^e\>;ES&nnW>JI)}':k9r?|0[ j4_ l6~2Zp \X:b"*}uq eo}V<uR\2$.h|x㮃hfm(;jTQ;~8j4iՈZTh@Pdp1)eK5 :8g!niI"ShDZd-zA7Qsô ]T C3+U:ޑUai> Ή97wO@ |1M5J_c|r>TGfF91:eҸ"eߋ*^[^KT}NE;ΫDzh?FѾ*B [-b]^s?y,ՏJ4{\rsTjUU.Zk̢FtuF"5#BDVuDELr"#,(qsEJKˌRB16)jN:*s5~ӵNq+H6t1MhkJ6a`E.{PJBb}t3C;A;i V_?L`tH8T}_~ӹ9~X,kc ^+BT~0ͺi.ُsjB<_DݠB(~9݆%&;yYZs/d0&JW t7w?yJF>sJZ9WE춺m^w4mޠxu,5gt!Ԁj/1Ye[j@8c ^Jƞ3^ pRF K$Ӳԕ^lq eI{K.И^I9Z"5BngV7dkV]WiZkBi} [ %D;|Y(]!V "yvlmֆ2 zܧvmW)j76+mc\aS(r ReZRe?Y#UKfpc)YR>72;?L^50&Wmqv7rwklK֓^rƀ>\nJ[[ P藷5!RREA]Ҙ:ˈ,5V˗'1r=\; UXnJ*dk 3linuв? x KMv}%?K񈨔>:kr\>IQLb SPE_`^s 'Ce? 
*J-pj!j℠D -xƔ)7j1zEbdχ"0|p 6j `T^_n\e`lYy/ FALR4( "tTixVضf ɑ0--$mUc4‘-"vqe b蕾m,+@-jYf[@3*%a%Aj$BqE=ʵqABj(]z)18ctde7"kخBp{%D!$Zp; EyJBfN9!Zq5-RUuw#^$lBi$1C&aGk0%V 83& ,5ά@pA{-NV1fG.ɱO$ְ ^,!xKB $[cBH!=rBtEQ-ʂ*דNm7R8T jy'Eoia(ULu3#OHk%tok=SzïgO&˩fEIB'Uhl^<)i/El6XYv~8i'Uu.I\fj۶_˽==⹾3g3dҎAGbzI9~dCdEщ4(pb.|x @ݎ_ZgKf0ԭ#f)*c n2:LFnE%T(aIn {Y/qujU(,dw*Rxl< SW;X)z=]֟r(r0O(IiBqv MfZDH+)1=]AK`4O^)B5pPga296@f}Yh02B1u.HǤH*R)3X5T+8nWsnI#)R+Pf75fFffDfD9%橦}wօ@E^ez I4ʿ [R-iB4M2-Ɔ !~(*]F) Jh[ŐS9+YusTzb7@"drrx95_.곊Kx?_Usb\>93a2F+a ?3tFH$$OSJ°io:]B vrkA x.mҁb+C"‘Sj*kD$d/ͥKL4$Cɹw uryw=I|vk/櫢TUi [FW P~j%Q0~E^??SI:/Zx1'??AD  o|dˮbio9&@AA$/֥`AS,>>#g 7[hulW{E)wD[Qp3P`RUE0_7Bwgna'ΫYqe_W"Cljů /zOAOO\u~H*"^0-7g"7^,`. Oe% M4ʦݤ}nN;x2Sڻz6,+7(8/ŻEbd:ݎ`KܻE n-XWnmJh,.˽R8sz1`:{_sT@B(Nw^]ꆴpzw*"} \꫊ T+bI`VPG1酕>&UUjMʼnRTƂrͼ]d-\8WkKq¶JGWDB1Ӊj7jE[[)S)mtي\RCI>(L"Vd{Mkdh$͇R4L%8BxeW^R񐶟$;ZweT&w"6HE{KzW^8HQ{OOE[JsOçn0"x`Du}5ǂ6#nUJGSr\]K'"G>HHcx7Ŝ 1JHkdќ!,8ܤBRm VQt&ќJN\h$u6Oah4K,1"7&͙ ̬ՂZ`.H+n(a&B iCbC"F1bCw c"KZNpJbaPg‘$g"g&Lr"Z QBͩEw|S=_ֺ5bRQFBurMpJB g &ڄ2!MT)QFQaJTęa]>hĻ0NbB:SǛN9q*> djBr)W_gmƽR#zX BL'u/ЩλKz6,+7^긯w=Ӊ}GvUGt⏸z6,+7ъMSG=-m=?PxqWEH$EcR_UVS}n˂w^X) @ FNVT\`b9gq*R/NV4{Y)jzMVvT=7" BVO|7PP-̑8`8͇4|-/~{]to]X|/!uY~J7v,2h Ёl`cWdu .[ m灺(9y&_d5jD:Ft}EtI=^3w7 ~U,o yYUuybo}|aL+/ njU6p@1LZpUg.ОxyhK *#©䢕ThMԊs8.G Cn!zVyIVFT/1A'&\gT l&rCRj (e F$\':̨$G2T+_1 sþCOB6*_eXz Qp/[!h Wʙ ./nh?>l&cح KE=Y ;!5w޸?$ %7JLEjNPKN(x5*yvDތ5=lJSG'mtA5 t lVRtDTׅcsыbICrWLSWT5|Ls $#3[Du@RYjthEpΛ֚8Rg6O0KL~3O#<шx/s붊c0ź?8ͳEYU`<* /KT[%jܙ&ʔ"c]ûW @ISUJz`4=V;CQjW#QW~{Oɶ(xzTxw\: upL 4!;.<cHƷHsUZпy]LxSϕKl]AaE h>56-2fyڙ3k,W-x:Da,UH,SII5 ""X2cs-%NFb Re jm Šq,/5bEgbVYTij ɮB]u@שK1)CYSR,KR4X $9Mc'Q B̪acYKU',`*1)MX Lƭagg@hìIM"@ԁ ꑠ(hVL˻kQtk>_]6pWOe!5SW.y!`ϯge拙->rу\流җUgagJE|B_;dX- tٛ-<%Ԛaۃr^1⬜s_5=ґ@ꬹ 1 ;X1ls;#`\wKV;X~%{<-Թ@v+2 5@ ̷Fw, }Z\ NA'; ?u}`xZidU5-W0X= ZUxF1 hmkkkk_8:'2e9/A4Kfy7@ 8AJ(R&)A{IJJ.}eyںڡQ LT>Q}LYA m!W׺~S|7:YlzsQ~voŚi(G++D!0tfSW:d8$1v$' aW@B6Z%,ۂBl95.5ۃBZ 7(3̼߬8h[,^qˠZos̿BsKqt5iRg_{Xxwzh`_z+t)Go"RṅnzMhUt"#s$<|=hoם"uȪwu\~ސ 
ٖ=!(-?^/>nF|[r)a.H(+zdܴ\L_mjoL(K}kZv_״bm3ہhسCgˬ gJb PY}ډ5vnñ_ ]^|{B>e#)Hf 9*OWWe$HqHI9A1/K5<њ[' 3١jr+k/oDU{1 +YX$)jr~~֡*opP؆IԭiM- ({!ЮQSԖEQE)x0Z)R3{JJ.*jlF!b6ܚ੊z.Vg.f90js{j2(0ǕQ2 AdG_I"'EΔ|.B~r R qƯ; ^DC(c j0IXǒLq@7n\0^>t-۠-,d bQVA!@^\v 7=23_oUi7~ f }Df=&y5}L_G ^.m8#)jX{mq 8~Q z^Uoo]EvCIuja ZR[@Ljۇp8$;ΐ ?^{V%Uܽ{nVw{+UҲN t2^.w_=}t>9ʕoǴ/>?mszR7oU9\on~vC`>*꣦쐅b T1Ʃꔲ'ƣ&W>H"sprǫ;;tP]?0[:]~Z=K +0 ˃1 8U.fP^9%JJcx\E.=:J ziz~v7g ʒw&u!V=EJו؉k5z/iE?FGm3h94&*Tqq=)/II͟):k4l ͦ_sg gQvbPxOJ?/˛VW^gԕF2q@bM}>WS9WSse!|ݵ_Ops`Y·O3:F}~N9܊IPj\4z{+Nxjo#4Hㅌ$HrĢL 2BVUK5b5z *~:ȯZs2F`/ 7`OcI:tZ-. -KMZ[ve- ޑߛ7B@#FtN>*>*>*>jJx3)WׄXr:K!kf.9 +01=ZE˭Dy3[j] < *;2:T+?E2 gi3 Y-qp#jh̎rK5RZr`fP!iU\%SsFp) 3rN "AT:pQňr,]sH{wkYV+%;z]}G\0q*LHNf @Ƭ9=\:נ'хhVӬ@nXU=|v1p'ppExK^J+l/?5.O6-jb'Z{e[TlB2kayr(Բدxd~-Y3Žx$Ā0~s|`܊V_{&6*dsJlBHB 9ytSsaU-hd*s.T(?lUtvE@T;heyjkh[~뎶,9gלFsBi._gH.nuw8:gHՋ,-mn"A%Z|[,Tm_FsʧpӇZ#˔ŧ}n7wv'WɍW{PYP ?bQ~ٵ*?k{U7K;Y l=xC -~Iq؉8 Zhz_|()2u=`gyq8Eѐ>čWWhkJUO}ig18)B^Yeu9KBI3FqzY=O캋K~zo%a·5~;P+,a~{]ilq=a+v4uiߘ i SG k9yu$?Q3$:J6zb*ۨ֡wʟ>Zx6BM@, GihQa XN^>\|>H4 WS-Q͠G9Ғ տJ=I]٤$?A7 HlnIw+$l,V9ml5@8"inEig͹yB-[.ӵZ)[S?k1gHb֪Qk;EUy6XJy4}0 cW^ALlx<~W}ʑ'ig/+V_tSe8M*g9ڊTK aՐGML/}/-JzRi`*w֓"Et# fmNݺҠFt~v;g(w>Hf}(Fn nd1J?-~rN&N e^,s$Ld6h:|6JW,U--bϒnILwd<>;wK>П8*A4D$ᆮL=a;SF6_ʄt=ޢ8#t0t"O5 dG$R% |x# Йw? Y7tFXg#`:4#3.RVD8/L))olwrXŮ8 B 3RZ'@@p☱D^f'7vnX. 
<|,D*eģ@8tN~ K>"劝Қq+j^Tr$;s}gb!QYD 3֋ZC$<&\ Vq2sr43R'@gqJ.>wHK,(2>@f$$%:sِ UZT#+TmfJ~Nfu8ܠ9d2 YSAbV}jyZߑԮ8|~-Ok[ꖣ܀qkL:tucZbZpA`H2 a;ǵIRўn}h_$mPE@ċC=1i11)í֠⏓*E#Ge.94@3EIӄBfpO?vlTpUc/?4 iU4 ŤpѰ~d`C,Gs}@qEwX &n݁h 8Xq^ D+V@qs{c9v;+eKYK7N"ɀ*"uDU:1Fҋ6Sm_03@:iVwK_AHiӯiܻQnQ?mb=[1s1ᐐ܌G3`sϯ@lk$>"ńBfL?j[^8cbҒS} Ӈ2AI µB[cDNC+D* X˄׾YWfw?%ͯm:2ܭ wH,S9SF7^y(6Xwb<_CyҢO.F}yq8<y;%(ӗ`=.l3QU҅,oy(51!CwY88_-_haM툦 VWlcwc9 }o/ҰMf,AR6E&%ͳ=0r \u(.2T0p gWYVв^jRiVV.G㝱\y\u(.Kn\dzYfB7xQLctUxFN1ԊUTf"?\<i@[,.,X2B!_;>j@Mt^&'KcSWh|wz^4NRTCur@{ԩk'o~'7~JŽᜂz=UTNxFɩ(#n];$dϒAuB'b,H8 ;72A$C~d*ذod|)4Y_}iZdaꁧy@GS3I w[ Q6c^7>2VD4i[>T])m ڠ)֢KI 'nu3M܎4q$aSiϼۅ.x`u&@@"O"r>XB*MPh&*KHs=Ptdes$]g#>]D\6T':dօe݁袍f :=}׌xsCRS ؙAZ(B,EUfcfYێvG(t q)>djIěw"DK?q?CJ49^^3R_U)4Ad "`E]ȍhd1gS*].BvJ5{@XcNc~Gh\)Aۦ);(UV%2N YҨn@pȣӒǻ= v׿ M}-32H}:hz88%ҹӁ՟Pd PNIiN~Cp)O-^6z^ǕO~ie>kig{ @T" ^ܜkE^7>2(qSb^ysds26)ED7$3xn29|b )}-3e3AX{M@2\JVK+A8c"VU? aGJp >+ֽB)C-&JYH6Q]ADeRqrk(Pkw5CEfrP`k[X@PI@^'@] T\0Iͷ&1r^"6X!%eҨJףy#G3P' !4E4*f[!IoKw.ߌt4Yew{zY>ЍgX%EGcm@l1$d=8;w׋:`ې &-zoE;/n<H9-Xf%l-?|%ZȠc=UHb WgL5RPL l2"UE$uF֨n>o\{AoMۻd@mzX%Cp_~QXUH2`_- q>y~(%%J.&1v$^ [W  A-l)՛ gp+Ee#m=9D:۬ ^ i+HqqyP9 $Bp}7Dgc'Gp,fLx%b"gG1_.,7d28Z_!ȾOc_ k>uƂpH"zHf l_yˣd N uSM{$hW|L/V"JIh'mz,H\ ONf%Vh-IDPRښ <$Ճyd E޵. 
F`H5V3 McL_=y|zx;6]iϯ>u|bޢjb*"4P~u.mGcq=(l0v?w]U|i%S%6$ V,rVskHtU5E>{rcC$5vYqC,/a"&ŪCNi8aKI?6-$XK厕~# b5}I:E ڀ``' 9kQ(]lJ9TY_Pz;@==x|쯌9yЊH(+yIsD P GpKUwGo2h1sVSM\狇.C`YY6G},牜s1[pT\UP3cr#y]h k}8hm1k_EXQF!߷a;v,1xNvWr\ +ʑgD<7-S(Q-×)~ > 2`A V|hp)G"7G~ -z퍛<}@)Jgߜћ(I )UɠyHau)1:93 Hq?߬`F}3xZ I1&oyz\Z~1ՏQ旃8.Я1NUK>7^H郃 ۷9(q uHE,v:$vjt{Ku:}V$4y?|짳ri3} n}gy.Hp-g/re+~,,[}QXEw ;=8'[k&]f^c=cV]['/9\we͍H0#qu6Rxc;n'J-j}xc&T") )nuQ2 Vd-̩k+߱3"DTJR5Րh:*[Eo:!7mرmEQ};yA>h Ofs"Q"dʐ*OZʗe@(g$/Kү+zmbt~֚:>*\xN)}h֧󳖢~y}R)*uZ&T5{vTp L;Le}_nIX±{*U6)f?hGEɌ:Z2ғ+.d m썪1jB{P|JJT`*Ƥ/t r QQk;E+Ǽj8X $UL$)UR[:!G Rܶu޸$( ^;YW X*|AH*NX 権ucRm 1~15 Wk8b B:ށ,pF+:"ʜ9e)dȄ #=h֧sT="3LET.M5s[%]bu4ZT4ΕLi'H}R4(XX S42Kg]l4!64GbPV\XşzT|"ş].&9p_ 'QEh#UOW%Q6KnWA1rrgp=s5XʘKSPG-)Ԝf:%"Ǽ8TBdiLXCR j(}c\mWF WA% %1Q1 G81ZY|h_vZR! |Z^S^H&Y^LL )?FX ʐK}lEt(K8Z>غ2ToBm^`v.'Gcu>-@ ya'ZZ>Te%99DX@>~lR+|Q]eC4TB_x<(황,u7GW@~p `=Q #7/*,5 e)|h(fYgBRpxKs.ĩ<<ӹYk™<gJ)G)L[Q`UAnjd$je_7VP^760PS R=|EO;m4*9Hq`KA@jyDTY[$;!)k"!n>}gt`.UR='WGOCVF9BXxFw)K~T8KC)L*^y3nUШ{L 7"@1LYp5[!DKC!u Dͧt ѐ 9E:QeLnpz1mi>Z#iPXRBuHP{ҫɕTQRABPTMM͍GZ6 1;?*c>FiaI|TAPI\)Zm4k+pU&\!h}pX+K53Aӯ82cj`ooП 1Ppaea4 -[{'1)}Z>:K{udKUR`\agYP!rb>scSۏ'9oIZ@;/msJdrnp~lmmUUXT8>Fp<2cN5<I.1Nr't;ƇնebC<:, !\$T-ݳ؟&1J򴱯%WE:{6TrN39fx0oiS^R4DՉu{߼iO?fW|Nodm;Uo-;=U cq  $ yIEi# YY-m4C[YˎzyDPl) %Cq&PKh%%~A}DeK)֠-ŽI-=6QT:brM'}.."SP!ӭ0ZFCUǟ?DjuSvzrvi%X}2c\^a V_޿mp4?;nZiS̅-s?3QIϧ .""~uywIEtَكq٠\I^gpj *+œ}?͢AOXq3YhwCW7~;E#^:m$L򎠔j6^;C.u\|xRP9,=|пܵsA!5~p_2.fδS=MC?wZ.&gwoۚ}/f/&7Ĺ'k{vbfyQϚ o[gi<4QjI_,Qg4E~{ڡǗO=9!8Mȵ)NāL@26~0adћnHD60H# nWeXd&uqFlO=u͢89t7`HL| 7 $~KYN/?ێ0NtJy*(z{mjr Ϋl^-DmAPG Ã(H鈌~;zg3b2'ϖOf ч,d7Pʒ vBJ͓R6m!/AJ_Á5\vP%M_h87y+H&TFS[W}Ψk"Vf>;$g#DM6Jd^2!=aEO+ j{7܏N˫}#$(bDcJ (-8&J]Qe9j&9/ ۤP Ō5apv0[F{0ϗ3;?b=oFovC8o9(8m)A젝o:;⠕zgC\m~ɛ :hLnp4JF()PvM?N@n׉Xϑh=W\C+WD$&BLB#I[Z}JݔRa! 
0g}_^{CŤ$&}K )-#H@+$I)P(7\9@[[N@CǁB,$h.f1NsKpR9!e# +qLu*Uvhʍ% Q9/P"FJ 4RJGuf\YG9]nYͮ5J$0r5pӒ3yk7=7Fʕ-#xYҳ2ZpQ"DRZN(rJ]f6Ce,rjh$Od$c/a|Q&dB'5tqg(C>]V3wXR3 yx=25YOp*JN8\r78&'?c7[)+{>o!Neo B&y +XWnIn ?4ӅYkuo/2}qgX+LXJ'k9d~>Cf̖ՏWĭ\a᤟m*4*]DWmjGJ¶3Áf 뇽bh)wc6GL"CK6ص!7Wday_CZ-7Ww̆Uy #9#tA{@>1+gf*pb+qxg>u..FC;V('+: [1ΫcXƋUlUo8R7 e4b?%b" Fm Q,e4 r}əL*6tcXn[d5|7 ^gs")ۈZB+]"ҹ/]"Q!82:3h8vvt@2a,[fJD$sS-`G[Aݨ)jun/p*!;k=աMD?ئ {%,^*Քu2[]4*N<<yޫ ]g]~jV]>TnןС7 syw͠ǿ=+W}׾të[-o/'sxʞX7yo!W|U+!ZjƮ?e$u#8Ds0ţ T -L3 trߑ3:LnHBN9ɛڍi6 trߑƻByrnHBN L $D-m]CqF!/TkO#Έ6rY+=l7#Jp(3{hBD^ZN{{$**lO S{[x ;DM+K$t,D v$T0.M%5>fFzMOjXϓkhiנWpW BS (j[3Jo Zp=x&(;//.VSѮKb]Q_r"ǣ%mRecq?E& LI)+e)eܚ\AY$@ 9K:E!4>R/xz5Z``g %9XnHAK(Z5`P˘uETNBJp0'"{^KeCwQpa*ep3҂2ٺFNB@c+@e( ]W]pCnkm\xi^^7OZxY~{ڬ~}YL$3̲\>pn=#Qb͖K!#8U.tsAW$-Hb3MJp뵮ա9SBk#VlK${K]G3*9g+3fY- kuwzS n5Zўwq'NoxiOQ#2E1TXlyOY_8gid/IAwM?#yDEs"M&~CDu :?5<] Ii>̾{-8 [x< 'jV14?~yf^44?h~ CsփW钔dڧY*.9ʖ BV%`Q%-=~ a% b``b)7J):[^fuh;;`NL0`7s 5.rڀH];Ư.ŁJ wC?wx4\n~-j^b@Q̵FpU2UW5o mKוi+/lija _Yl cZ 8yAC9$r^Gjad](ͤ+%cեb~8KClS1J`d){fh;l@))0dXm rThrM\ݫ:A)Ȝ78nFD1g>Ok e@l>Acݝ6&|+ Y3<]s @+0n;7' fn%H_q+8ǀ5\SK|*uY)Z8iap_B9ֶa`~: ϘG\O"(`-pbZ{[GD޼XG+ R(¶, FSY$ Wl QM; WPeL,>J"7ؘCQ}٤IPoCEӐbNԝǂcS̓OxwA 6xd.sBRtٛ [ Cnd(m A6A{H?E" R9fM=ǍpC\{I~lL?s$bN_pcѸ+B+}J$'/0ڶ忸jjMIt,7|z 'fãG0ZYozxaibQ50=vRMF'h$jW˨$)F1 W)E`XÄ5d60ڶK"C T 2pnQy_L94vMҙ`^AT=<+-ΔTPP$)^uLu-%EiY+"TTNT1€Җ*mOc]WgdWϿ߹ޛwŹټA+lT#'(2Jpiȏ]oOz@d#oIsޏϥ1;SIo35G6"[O!/TkpQ,>r; k!̄(syNu=X)[kK^TK͹ l,B # `@ uκõrfWsTm Oä $zJ}}rRQV sJYw KDa@ sLCUg+/hn],>-nK6fW~j߅FV/bklw?[ѳ #afk'wtK'˻s&?^_fyuq3[wO¹]|ǵr6^L4f^ a}Zc"2=2`,bnk%n6-`nvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000003354251515145670371017716 0ustar rootrootFeb 19 18:33:44 crc systemd[1]: Starting Kubernetes Kubelet... 
Feb 19 18:33:44 crc restorecon[4709]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 18:33:44 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c263,c871 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc 
restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc 
restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 
18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 
crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 
18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc 
restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc 
restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 18:33:45 crc restorecon[4709]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc 
restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc 
restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 18:33:45 crc restorecon[4709]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 19 18:33:46 crc kubenswrapper[4749]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 18:33:46 crc kubenswrapper[4749]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 19 18:33:46 crc kubenswrapper[4749]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 18:33:46 crc kubenswrapper[4749]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 19 18:33:46 crc kubenswrapper[4749]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 19 18:33:46 crc kubenswrapper[4749]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.463921 4749 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476512 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476553 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476564 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476573 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476581 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476590 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476599 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476607 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476615 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476624 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476632 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476640 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476648 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476656 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476663 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476671 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476679 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476688 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476695 4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476703 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476712 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476720 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476728 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476736 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476751 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476759 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476767 4749 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476776 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476784 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476794 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476802 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476810 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476819 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476827 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476841 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476851 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476862 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476873 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476884 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476902 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476918 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476929 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476939 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476950 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476959 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476974 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476986 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.476995 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477004 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477012 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477020 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477082 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477096 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477105 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477114 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477123 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477131 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477140 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477151 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477161 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477168 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477176 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477184 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477191 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477199 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477207 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477214 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477225 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477235 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477245 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.477254 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477454 4749 flags.go:64] FLAG: --address="0.0.0.0"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477480 4749 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477502 4749 flags.go:64] FLAG: --anonymous-auth="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477517 4749 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477539 4749 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477550 4749 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477563 4749 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477575 4749 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477585 4749 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477594 4749 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477603 4749 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477614 4749 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477623 4749 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477633 4749 flags.go:64] FLAG: --cgroup-root=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477641 4749 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477652 4749 flags.go:64] FLAG: --client-ca-file=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477661 4749 flags.go:64] FLAG: --cloud-config=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477669 4749 flags.go:64] FLAG: --cloud-provider=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477678 4749 flags.go:64] FLAG: --cluster-dns="[]"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477689 4749 flags.go:64] FLAG: --cluster-domain=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477698 4749 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477707 4749 flags.go:64] FLAG: --config-dir=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477715 4749 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477725 4749 flags.go:64] FLAG: --container-log-max-files="5"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477736 4749 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477745 4749 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477754 4749 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477763 4749 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477772 4749 flags.go:64] FLAG: --contention-profiling="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477781 4749 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477790 4749 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477800 4749 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477809 4749 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477820 4749 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477829 4749 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477838 4749 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477848 4749 flags.go:64] FLAG: --enable-load-reader="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477857 4749 flags.go:64] FLAG: --enable-server="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477866 4749 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477877 4749 flags.go:64] FLAG: --event-burst="100"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477887 4749 flags.go:64] FLAG: --event-qps="50"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477896 4749 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477905 4749 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477914 4749 flags.go:64] FLAG: --eviction-hard=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477925 4749 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477934 4749 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477943 4749 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477954 4749 flags.go:64] FLAG: --eviction-soft=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.477992 4749 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478002 4749 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478012 4749 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478058 4749 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478073 4749 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478084 4749 flags.go:64] FLAG: --fail-swap-on="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478093 4749 flags.go:64] FLAG: --feature-gates=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478105 4749 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478115 4749 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478124 4749 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478133 4749 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478142 4749 flags.go:64] FLAG: --healthz-port="10248"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478152 4749 flags.go:64] FLAG: --help="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478161 4749 flags.go:64] FLAG: --hostname-override=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478169 4749 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478179 4749 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478188 4749 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478196 4749 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478205 4749 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478214 4749 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478223 4749 flags.go:64] FLAG: --image-service-endpoint=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478232 4749 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478240 4749 flags.go:64] FLAG: --kube-api-burst="100"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478250 4749 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478259 4749 flags.go:64] FLAG: --kube-api-qps="50"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478268 4749 flags.go:64] FLAG: --kube-reserved=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478278 4749 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478286 4749 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478296 4749 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478304 4749 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478313 4749 flags.go:64] FLAG: --lock-file=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478322 4749 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478331 4749 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478340 4749 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478355 4749 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478365 4749 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478376 4749 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478384 4749 flags.go:64] FLAG: --logging-format="text"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478393 4749 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478403 4749 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478412 4749 flags.go:64] FLAG: --manifest-url=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478494 4749 flags.go:64] FLAG: --manifest-url-header=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478507 4749 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478516 4749 flags.go:64] FLAG: --max-open-files="1000000"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478528 4749 flags.go:64] FLAG: --max-pods="110"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478537 4749 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478546 4749 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478554 4749 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478563 4749 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478573 4749 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478582 4749 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478591 4749 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478612 4749 flags.go:64] FLAG: --node-status-max-images="50"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478621 4749 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478631 4749 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478640 4749 flags.go:64] FLAG: --pod-cidr=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478649 4749 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478661 4749 flags.go:64] FLAG: --pod-manifest-path=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478670 4749 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478679 4749 flags.go:64] FLAG: --pods-per-core="0"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478688 4749 flags.go:64] FLAG: --port="10250"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478697 4749 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478706 4749 flags.go:64] FLAG: --provider-id=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478715 4749 flags.go:64] FLAG: --qos-reserved=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478724 4749 flags.go:64] FLAG: --read-only-port="10255"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478733 4749 flags.go:64] FLAG: --register-node="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478741 4749 flags.go:64] FLAG: --register-schedulable="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478750 4749 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478767 4749 flags.go:64] FLAG: --registry-burst="10"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478782 4749 flags.go:64] FLAG: --registry-qps="5"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478791 4749 flags.go:64] FLAG: --reserved-cpus=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478809 4749 flags.go:64] FLAG: --reserved-memory=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478820 4749 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478829 4749 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478839 4749 flags.go:64] FLAG: --rotate-certificates="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478848 4749 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478857 4749 flags.go:64] FLAG: --runonce="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478865 4749 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478874 4749 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478884 4749 flags.go:64] FLAG: --seccomp-default="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478894 4749 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478902 4749 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478912 4749 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478921 4749 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478930 4749 flags.go:64] FLAG: --storage-driver-password="root"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478940 4749 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478952 4749 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478961 4749 flags.go:64] FLAG: --storage-driver-user="root"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478969 4749 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478979 4749 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478988 4749 flags.go:64] FLAG: --system-cgroups=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.478997 4749 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479013 4749 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479053 4749 flags.go:64] FLAG: --tls-cert-file=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479066 4749 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479082 4749 flags.go:64] FLAG: --tls-min-version=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479092 4749 flags.go:64] FLAG: --tls-private-key-file=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479100 4749 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479109 4749 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479118 4749 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479127 4749 flags.go:64] FLAG: --v="2"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479143 4749 flags.go:64] FLAG: --version="false"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479155 4749 flags.go:64] FLAG: --vmodule=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479166 4749 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.479175 4749 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479433 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479448 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479459 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479468 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479477 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479488 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479499 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479510 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479519 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479527 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479535 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479545 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479555 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479569 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479577 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479585 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479594 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479602 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479610 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479617 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479625 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479633 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479641 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479650 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479658 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479665 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479673 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479684 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479695 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479703 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479711 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479718 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479726 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479734 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479741 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479749 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479757 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479765 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479773 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479781 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479789 4749 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479796 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479804 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479812 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479819 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479828 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479836 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479843 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479851 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479859 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479866 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479874 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479882 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479890 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479897 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479905 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479912 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479920 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479928 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479935 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479946 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479953 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479961 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.479968 4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.480013 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219
18:33:46.480021 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.480060 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.480071 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.480081 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.480090 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.480098 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.480837 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.499532 4749 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.499604 4749 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499717 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499730 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499736 4749 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAzure Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499744 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499778 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499789 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499795 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499801 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499807 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499814 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499820 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499826 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499856 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499862 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499867 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499873 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499877 4749 feature_gate.go:330] unrecognized feature gate: 
DNSNameResolver Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499884 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499891 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499896 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499902 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499907 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499912 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499936 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499941 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499947 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499952 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499957 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499962 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499967 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499972 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 
19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499977 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499982 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.499988 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500070 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500078 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500084 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500089 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500094 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500099 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500104 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500110 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500116 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500123 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500154 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 
18:33:46.500161 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500167 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500174 4749 feature_gate.go:330] unrecognized feature gate: Example Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500179 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500188 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500197 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500204 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500234 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500240 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500246 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500251 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500258 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500265 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500270 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500275 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500281 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500285 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500291 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500316 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500321 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500326 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500331 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500336 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500341 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500346 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500352 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.500362 4749 feature_gate.go:386] 
feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500639 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500650 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500656 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500661 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500668 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500673 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500679 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500684 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500690 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500695 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500722 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 18:33:46 crc 
kubenswrapper[4749]: W0219 18:33:46.500728 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500733 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500738 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500743 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500747 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500752 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500757 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500762 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500767 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500771 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500778 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500784 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500790 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500795 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500822 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500828 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500834 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500840 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500845 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500850 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500857 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500862 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500867 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500879 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500884 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 18:33:46 
crc kubenswrapper[4749]: W0219 18:33:46.500889 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500896 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500902 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500909 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500915 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500921 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500927 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500934 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500962 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500969 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500975 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500985 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500992 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.500998 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501004 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501009 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501014 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501019 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501038 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501043 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501048 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501053 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501058 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501062 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501067 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501072 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501077 4749 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501082 4749 feature_gate.go:330] unrecognized feature gate: Example Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501087 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501092 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501097 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501101 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501107 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501111 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.501118 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.501127 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.501370 4749 server.go:940] "Client rotation is on, will bootstrap in background" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.505780 4749 bootstrap.go:85] "Current kubeconfig file contents are still valid, 
no bootstrap necessary" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.505893 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.507862 4749 server.go:997] "Starting client certificate rotation" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.507903 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.508128 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-11 01:25:09.282969977 +0000 UTC Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.508216 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.534294 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.537009 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.538529 4749 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.554452 4749 log.go:25] "Validated CRI v1 runtime API" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.586680 4749 log.go:25] "Validated CRI v1 image API" Feb 19 18:33:46 crc kubenswrapper[4749]: 
I0219 18:33:46.588917 4749 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.592693 4749 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-19-18-28-39-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.592725 4749 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.607574 4749 manager.go:217] Machine: {Timestamp:2026-02-19 18:33:46.605301724 +0000 UTC m=+0.566521698 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8d50dffb-ce61-4602-b187-6c64cf1965b9 BootID:81008fcf-511f-47db-bde8-866a4e6a7c7b Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs 
Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7e:8e:31 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7e:8e:31 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9f:02:c5 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fb:50:1d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:65:bf:ab Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:45:b2:00 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f2:89:7a:0f:1d:99 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:12:3b:89:ab:18:31 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] 
Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.607788 4749 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.607983 4749 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.608329 4749 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.608517 4749 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.608567 4749 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.609191 4749 topology_manager.go:138] "Creating topology manager with none policy" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.609208 4749 container_manager_linux.go:303] "Creating device plugin manager" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.609579 4749 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.609602 4749 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.609759 4749 state_mem.go:36] "Initialized new in-memory state store" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.609844 4749 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.612848 4749 kubelet.go:418] "Attempting to sync node with API server" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.612868 4749 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.612882 4749 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.612893 4749 kubelet.go:324] "Adding apiserver pod source" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.612906 4749 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.616542 4749 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.617413 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.617492 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.617514 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.617576 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.617614 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.620827 4749 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622191 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622222 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 
18:33:46.622229 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622235 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622245 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622252 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622259 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622269 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622277 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622284 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622294 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.622300 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.623090 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.623585 4749 server.go:1280] "Started kubelet" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.624494 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:46 crc systemd[1]: Started Kubernetes Kubelet. 
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.625043 4749 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.625717 4749 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.625089 4749 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.628682 4749 server.go:460] "Adding debug handlers to kubelet server" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.633807 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.633861 4749 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.634150 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 23:11:15.887160911 +0000 UTC Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.634217 4749 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.634239 4749 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.634244 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.634338 4749 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.634609 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.128:6443: connect: connection refused" interval="200ms" Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.634887 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.634990 4749 factory.go:55] Registering systemd factory Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.635014 4749 factory.go:221] Registration of the systemd container factory successfully Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.635252 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.636019 4749 factory.go:153] Registering CRI-O factory Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.636053 4749 factory.go:221] Registration of the crio container factory successfully Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.636115 4749 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.636137 4749 factory.go:103] Registering Raw factory Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.636151 4749 manager.go:1196] Started watching for new ooms in manager Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.636369 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.128:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895b98907419475 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 18:33:46.623554677 +0000 UTC m=+0.584774631,LastTimestamp:2026-02-19 18:33:46.623554677 +0000 UTC m=+0.584774631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.637290 4749 manager.go:319] Starting recovery of all containers Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641717 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641748 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641759 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641768 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641777 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641785 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641793 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641802 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641813 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641821 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" 
seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641830 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641838 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641847 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641857 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641868 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641876 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 19 18:33:46 crc 
kubenswrapper[4749]: I0219 18:33:46.641885 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641901 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641909 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641917 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641925 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641936 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641945 4749 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641952 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.641985 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642010 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642036 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642048 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642057 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642066 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642077 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642086 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642095 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642104 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642131 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642141 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642151 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642160 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642170 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642192 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642201 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642211 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642219 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642228 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642237 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642259 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642269 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" 
seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642278 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642286 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642299 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642308 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642316 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642342 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642352 
4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642364 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642374 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642417 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642427 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642436 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642445 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642453 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642478 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642488 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642496 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642505 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642514 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642522 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642532 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642554 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642562 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642571 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642580 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642588 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642597 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642606 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642625 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642634 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642653 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642663 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642672 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642697 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642706 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642714 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642723 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642732 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642741 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642750 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642766 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642775 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642784 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642793 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642803 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642811 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642820 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642842 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642851 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642860 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642868 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642877 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642891 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642909 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642918 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642928 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642936 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642948 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642958 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642967 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642988 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.642997 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643006 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643016 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643049 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643060 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643068 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643075 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643083 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643092 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643100 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643108 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643127 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643134 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643142 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643151 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643159 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643167 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643174 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643197 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643206 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643226 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643235 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643243 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643255 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643264 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643313 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.643323 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644848 4749 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644873 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644885 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644903 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644912 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644922 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644930 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644941 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644949 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644957 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644966 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644975 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644986 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.644994 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645002 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645011 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645036 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645044 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645053 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645061 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645069 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645078 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645086 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645096 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645104 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645113 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645122 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645131 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645140 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645149 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645159 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645168 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645176 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645185 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645195 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645204 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645212 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645221 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645231 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645241 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645250 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 19
18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645259 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645268 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645277 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645286 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645295 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645319 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645328 
4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645338 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645346 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645355 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645363 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645372 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645382 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645391 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645400 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645410 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645419 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645427 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645437 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645446 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645455 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645464 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645473 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645481 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645489 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645498 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645506 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645515 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645524 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645533 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645542 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645550 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645558 4749 reconstruct.go:97] "Volume reconstruction finished" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.645565 4749 reconciler.go:26] "Reconciler: start to sync state" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.654169 4749 manager.go:324] Recovery completed Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.667417 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.668673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.668699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.668708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.669714 4749 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.669729 4749 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.669744 4749 state_mem.go:36] "Initialized new in-memory state store" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.676132 4749 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.677584 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.677621 4749 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.677644 4749 kubelet.go:2335] "Starting kubelet main sync loop" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.677712 4749 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 19 18:33:46 crc kubenswrapper[4749]: W0219 18:33:46.680412 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.680487 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.690248 4749 policy_none.go:49] "None policy: Start" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.690944 4749 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.690986 4749 state_mem.go:35] "Initializing new in-memory state store" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.734913 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.745958 4749 manager.go:334] 
"Starting Device Plugin manager" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.745996 4749 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.746007 4749 server.go:79] "Starting device plugin registration server" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.746377 4749 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.746395 4749 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.746568 4749 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.746632 4749 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.746638 4749 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.753289 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.778526 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.778643 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.779598 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.779626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.779635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.779750 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.780306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.780328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.780336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.780978 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.781010 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.781021 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.781107 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.781131 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.781741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.781782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.781795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782145 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782270 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782330 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.782529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783349 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc 
kubenswrapper[4749]: I0219 18:33:46.783478 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783502 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.783964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.784194 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.784283 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.786652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.786688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.786699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.787995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.788040 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.788054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.835548 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="400ms" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.846489 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.846912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.846941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.846958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.846974 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.846996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847040 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847191 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.847400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.848003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.848058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.848071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.848091 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 18:33:46 crc kubenswrapper[4749]: E0219 18:33:46.848479 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.128:6443: connect: connection refused" node="crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.947968 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948145 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948494 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948624 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948747 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948892 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:46 
crc kubenswrapper[4749]: I0219 18:33:46.948924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.949017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.949074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948958 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.949102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.949144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.948786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.949209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.949277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:46 crc kubenswrapper[4749]: I0219 18:33:46.949324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.049355 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.050760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.050787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.050798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.050822 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 18:33:47 crc kubenswrapper[4749]: E0219 18:33:47.051197 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.128:6443: connect: connection refused" node="crc" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.111609 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.118386 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.131527 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.151360 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.154314 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 18:33:47 crc kubenswrapper[4749]: W0219 18:33:47.154942 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5a150f8fe39e8221cd36fe6130868de89d4461a884f57b45973286151e06d79d WatchSource:0}: Error finding container 5a150f8fe39e8221cd36fe6130868de89d4461a884f57b45973286151e06d79d: Status 404 returned error can't find the container with id 5a150f8fe39e8221cd36fe6130868de89d4461a884f57b45973286151e06d79d Feb 19 18:33:47 crc kubenswrapper[4749]: W0219 18:33:47.156837 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-870409578ebfeb6a47027bc15aa1e35b018cb8905fabf4a854ef608c8985032a WatchSource:0}: Error finding container 870409578ebfeb6a47027bc15aa1e35b018cb8905fabf4a854ef608c8985032a: Status 404 returned error can't find the container with id 870409578ebfeb6a47027bc15aa1e35b018cb8905fabf4a854ef608c8985032a Feb 19 18:33:47 crc kubenswrapper[4749]: W0219 18:33:47.171371 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-592c17b7cc9055725dfa6a4af642a2626ec65967acf1290f59262faa2edccd1b WatchSource:0}: Error finding container 592c17b7cc9055725dfa6a4af642a2626ec65967acf1290f59262faa2edccd1b: Status 404 returned error can't find the container with id 592c17b7cc9055725dfa6a4af642a2626ec65967acf1290f59262faa2edccd1b Feb 19 18:33:47 crc kubenswrapper[4749]: E0219 18:33:47.236871 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="800ms" Feb 19 18:33:47 crc kubenswrapper[4749]: W0219 18:33:47.426424 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:47 crc kubenswrapper[4749]: E0219 18:33:47.426556 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.452189 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.453614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.453673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:47 crc 
kubenswrapper[4749]: I0219 18:33:47.453684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.453711 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 18:33:47 crc kubenswrapper[4749]: E0219 18:33:47.454254 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.128:6443: connect: connection refused" node="crc" Feb 19 18:33:47 crc kubenswrapper[4749]: W0219 18:33:47.513288 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:47 crc kubenswrapper[4749]: E0219 18:33:47.513392 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.625792 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.634938 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 03:03:38.568875717 +0000 UTC Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.683695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bf60ee73cd12dbb0bebbc4689e454ff0174742dea571dc22b7ed2e6050172eb1"} Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.684816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"592c17b7cc9055725dfa6a4af642a2626ec65967acf1290f59262faa2edccd1b"} Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.685638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"269d7d73b50bd26e5da1afb053a3ba78a3334ace327b855b9720b2c6ca539b54"} Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.686379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5a150f8fe39e8221cd36fe6130868de89d4461a884f57b45973286151e06d79d"} Feb 19 18:33:47 crc kubenswrapper[4749]: I0219 18:33:47.686980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"870409578ebfeb6a47027bc15aa1e35b018cb8905fabf4a854ef608c8985032a"} Feb 19 18:33:47 crc kubenswrapper[4749]: W0219 18:33:47.746078 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:47 crc kubenswrapper[4749]: E0219 18:33:47.746449 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:47 crc kubenswrapper[4749]: W0219 18:33:47.834869 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:47 crc kubenswrapper[4749]: E0219 18:33:47.834926 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:48 crc kubenswrapper[4749]: E0219 18:33:48.038342 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="1.6s" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.255226 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.259302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.259436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.259450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 
18:33:48.259475 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 18:33:48 crc kubenswrapper[4749]: E0219 18:33:48.259875 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.128:6443: connect: connection refused" node="crc" Feb 19 18:33:48 crc kubenswrapper[4749]: E0219 18:33:48.504638 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.128:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895b98907419475 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 18:33:46.623554677 +0000 UTC m=+0.584774631,LastTimestamp:2026-02-19 18:33:46.623554677 +0000 UTC m=+0.584774631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.626206 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.635312 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 21:45:11.720749788 +0000 UTC Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.637478 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 18:33:48 crc kubenswrapper[4749]: E0219 18:33:48.638329 
4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.693748 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0caa5485ff1e9a463ad1de11b0d4750b72a97e25340ef78f8de4706a01eaa0a2" exitCode=0 Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.693819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0caa5485ff1e9a463ad1de11b0d4750b72a97e25340ef78f8de4706a01eaa0a2"} Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.693982 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.695451 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a3a5f80308cdb672b18e81484a7f7ca76bd0d91256e5ff3ad12d580bc23164e4" exitCode=0 Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.695527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a3a5f80308cdb672b18e81484a7f7ca76bd0d91256e5ff3ad12d580bc23164e4"} Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.695578 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.695844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.695870 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.695879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.696583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.696619 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.696659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.698409 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e5d19c39ddbfb4f41c69c2f259828fe6ffe52ab248f9b387d3f0f1e27f2ce974" exitCode=0 Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.698489 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.698501 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e5d19c39ddbfb4f41c69c2f259828fe6ffe52ab248f9b387d3f0f1e27f2ce974"} Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.699509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.699554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:33:48 
crc kubenswrapper[4749]: I0219 18:33:48.699570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.702711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1f89a78133a284f9f26d5276d2e9de76077a925a7e504cb03a523a2a501cb787"}
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.702750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5d08fedcd92f642d5ba6d00bef77973a15b18fc25a0bd12a3eac4c79ae642acc"}
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.702765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"982d904c52a06d7228b35166501e609ec23e23e834d0d6d4b5e8bc8ab9150411"}
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.702778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9b3881770674ef4130e3b94e5d7424d9474dc6a57df26af788808d649d3933ed"}
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.702836 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.703748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.703782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.703797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.705119 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302" exitCode=0
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.705178 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302"}
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.705216 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.706045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.706071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.706081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.708479 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.713642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.713673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:48 crc kubenswrapper[4749]: I0219 18:33:48.713686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.378728 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.625589 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.635920 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 08:18:53.803419166 +0000 UTC
Feb 19 18:33:49 crc kubenswrapper[4749]: E0219 18:33:49.639779 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="3.2s"
Feb 19 18:33:49 crc kubenswrapper[4749]: W0219 18:33:49.671267 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused
Feb 19 18:33:49 crc kubenswrapper[4749]: E0219 18:33:49.671416 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.709914 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8a5cd39b9c6592f94b34ebeeb25c35da92a774c780dcda7120f1802dc6a43b54" exitCode=0
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.710014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8a5cd39b9c6592f94b34ebeeb25c35da92a774c780dcda7120f1802dc6a43b54"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.710140 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.712530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.712578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.712589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.713238 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.713151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2f37fc83d4bd24c3a105057a46f676b23a53c3dad9d05873d2239c7afa2620ee"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.713491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0bc8d04543a076c5bb7a8d2682109d3871b9fbd1404b64b7ffc3ef3679ef5a6b"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.713529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f67ba3006acf53bfb4b220c8bef0a0e9b7783d9b506e93acfeef503ec1817fad"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.714204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.714234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.714246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.715092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"45fbd065062d841d80da8ff214139c22fd61cbc3b71a294e34d92c96f1723c51"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.715121 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.715925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.715952 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.715963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.719557 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.719900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.719922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.719935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.719944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591"}
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.721307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.721330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.721340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:49 crc kubenswrapper[4749]: W0219 18:33:49.824173 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused
Feb 19 18:33:49 crc kubenswrapper[4749]: E0219 18:33:49.824242 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.860749 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.861957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.862001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.862017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:49 crc kubenswrapper[4749]: I0219 18:33:49.862065 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 18:33:49 crc kubenswrapper[4749]: E0219 18:33:49.862538 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.128:6443: connect: connection refused" node="crc"
Feb 19 18:33:50 crc kubenswrapper[4749]: W0219 18:33:50.226385 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused
Feb 19 18:33:50 crc kubenswrapper[4749]: E0219 18:33:50.226474 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.625207 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.636639 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 16:18:49.274615597 +0000 UTC
Feb 19 18:33:50 crc kubenswrapper[4749]: W0219 18:33:50.638300 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.128:6443: connect: connection refused
Feb 19 18:33:50 crc kubenswrapper[4749]: E0219 18:33:50.638406 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.128:6443: connect: connection refused" logger="UnhandledError"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.724891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016"}
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.724958 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.726060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.726088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.726099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.727285 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1b210e1387fac4f36f5c021c84dfbdfebba8651156b88daa14c1832766e2e52a" exitCode=0
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.727357 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.727398 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.727410 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.727713 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.727947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1b210e1387fac4f36f5c021c84dfbdfebba8651156b88daa14c1832766e2e52a"}
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.728787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:50 crc kubenswrapper[4749]: I0219 18:33:50.732249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.037117 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.637322 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 07:49:31.309296007 +0000 UTC
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.734103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6c41ac527837223fd88209458196523059d65415abe87649e42afed70629386"}
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.734162 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.734192 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"94b4924ca142d588748a5f26c65af6b0b2ef068acc87e40f70379eb092668c4c"}
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.734162 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.734227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"018a3849b556d5317f7007b13a007ac58d10b6d40f3c02364588a4684e5ae1fe"}
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.734459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0de58629d0ca399d72c836f81d521da0502d9227f375f678ad87e74d86cf62f9"}
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.735003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.735105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.735133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.735159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.735177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:51 crc kubenswrapper[4749]: I0219 18:33:51.735184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.183615 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.183762 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.184805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.184879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.184899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.637698 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 22:18:47.936628199 +0000 UTC
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.725302 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.746428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8de5fa8fc53c6969bc62d2bf82680c2b81905ce684ec8ba4e4f2b293d201bee0"}
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.746460 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.746470 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.748005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.748057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.748094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.748072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.748150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:52 crc kubenswrapper[4749]: I0219 18:33:52.748113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.062944 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.064804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.064856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.064869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.064897 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.277137 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.638364 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 22:54:07.669880419 +0000 UTC
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.748567 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.749765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.749806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:53 crc kubenswrapper[4749]: I0219 18:33:53.749823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.030937 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.031258 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.032760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.032810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.032827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.639430 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 10:29:57.691557403 +0000 UTC
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.751215 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.751967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.751997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:54 crc kubenswrapper[4749]: I0219 18:33:54.752005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.061661 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.327289 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.327570 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.329343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.329416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.329436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.640223 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 09:07:37.143422608 +0000 UTC
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.713022 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.713247 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.714784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.715154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.715211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.754088 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.755526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.755576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:55 crc kubenswrapper[4749]: I0219 18:33:55.755595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:56 crc kubenswrapper[4749]: I0219 18:33:56.641303 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 01:27:21.481882439 +0000 UTC
Feb 19 18:33:56 crc kubenswrapper[4749]: E0219 18:33:56.753436 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 19 18:33:57 crc kubenswrapper[4749]: I0219 18:33:57.642289 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:58:19.652146737 +0000 UTC
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.058265 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.058409 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.059589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.059662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.059682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.066494 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.643301 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 01:17:47.108323967 +0000 UTC
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.713181 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.713301 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.759626 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.760771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.760796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.760804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:33:58 crc kubenswrapper[4749]: I0219 18:33:58.765256 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:33:59 crc kubenswrapper[4749]: I0219 18:33:59.643872 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 03:45:16.162051079 +0000 UTC
Feb 19 18:33:59 crc kubenswrapper[4749]: I0219 18:33:59.762749 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:33:59 crc kubenswrapper[4749]: I0219 18:33:59.763987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:33:59 crc kubenswrapper[4749]: I0219 18:33:59.764071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:33:59 crc kubenswrapper[4749]: I0219 18:33:59.764088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:34:00 crc kubenswrapper[4749]: I0219 18:34:00.646078 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 21:56:59.824358399 +0000 UTC
Feb 19 18:34:01 crc kubenswrapper[4749]: I0219 18:34:01.267729 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 18:34:01 crc kubenswrapper[4749]: I0219 18:34:01.267786 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 18:34:01 crc kubenswrapper[4749]: I0219 18:34:01.272873 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 18:34:01 crc kubenswrapper[4749]: I0219 18:34:01.272921 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 18:34:01 crc kubenswrapper[4749]: I0219 18:34:01.646805 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 22:13:26.919342344 +0000 UTC
Feb 19 18:34:02 crc kubenswrapper[4749]: I0219 18:34:02.648446 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:33:07.249436143 +0000 UTC
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.305079 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.305228 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.306194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.306226 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.306235 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.315962 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.649786 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 15:32:20.098473531 +0000 UTC
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.771002 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.771906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.771938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:34:03 crc kubenswrapper[4749]: I0219 18:34:03.771948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:34:04 crc kubenswrapper[4749]: I0219 18:34:04.650286 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 12:16:44.127460603 +0000 UTC
Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.332221 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.332394 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.333494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.333524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.333533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.336533 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.650360 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 09:28:39.216478088 +0000 UTC Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.775566 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.775611 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.776318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.776409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 18:34:05 crc kubenswrapper[4749]: I0219 18:34:05.776474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.271428 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.271832 4749 trace.go:236] Trace[1439738832]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 18:33:52.890) (total time: 13381ms): Feb 19 18:34:06 crc kubenswrapper[4749]: Trace[1439738832]: ---"Objects listed" error: 13381ms (18:34:06.271) Feb 19 18:34:06 crc kubenswrapper[4749]: Trace[1439738832]: [13.381413986s] [13.381413986s] END Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.271874 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.283569 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.284390 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.287074 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.290375 4749 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.292325 4749 trace.go:236] Trace[380444285]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 18:33:53.967) (total time: 12324ms): Feb 19 18:34:06 crc kubenswrapper[4749]: Trace[380444285]: ---"Objects listed" error: 12324ms 
(18:34:06.292) Feb 19 18:34:06 crc kubenswrapper[4749]: Trace[380444285]: [12.324409389s] [12.324409389s] END Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.292377 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.296652 4749 trace.go:236] Trace[769163446]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 18:33:55.611) (total time: 10685ms): Feb 19 18:34:06 crc kubenswrapper[4749]: Trace[769163446]: ---"Objects listed" error: 10685ms (18:34:06.296) Feb 19 18:34:06 crc kubenswrapper[4749]: Trace[769163446]: [10.685593546s] [10.685593546s] END Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.296681 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.321369 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60746->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.321449 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60746->192.168.126.11:17697: read: connection reset by peer" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.321762 4749 csr.go:261] certificate signing request csr-wx4nc is approved, waiting to be issued Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.321847 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.321906 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.323262 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36838->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.323327 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36838->192.168.126.11:17697: read: connection reset by peer" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.626070 4749 apiserver.go:52] "Watching apiserver" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.651156 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 02:33:29.31353175 +0000 UTC Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.652309 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.652539 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.652856 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.652931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.652950 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.652980 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.652954 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.653267 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.653306 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.653317 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.653366 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.654651 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.654920 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.655054 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.655224 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.655458 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.655763 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.656148 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.656675 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.657376 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.663944 4749 csr.go:257] certificate signing request csr-wx4nc is issued Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.692379 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-njxlm"] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.692641 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-nzldt"] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.692794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.692819 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-sjmng"] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.692986 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.693615 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4w8w4"] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.693772 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.693848 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.695625 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.695717 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.695776 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.697821 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.697875 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.697907 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698083 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698225 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698237 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698230 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698223 4749 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698314 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698484 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.698683 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.699226 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.699322 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.732325 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.735740 4749 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.746810 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.763547 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.771751 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.779879 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.780810 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4w8w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51c18c30-ebd6-48f1-bc44-97849f648ed2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4w8w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.784263 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016" exitCode=255 Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.784311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016"} Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.791439 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794243 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794383 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794453 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794523 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794776 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794846 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.794982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795213 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795420 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795417 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795534 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795562 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795590 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795661 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795667 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795708 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 
18:34:06.795757 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795809 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795846 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.795975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796112 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796201 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 
18:34:06.796222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796274 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796303 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796326 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796373 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796459 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796504 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796549 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796636 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc 
kubenswrapper[4749]: I0219 18:34:06.796656 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796700 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796721 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796727 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796742 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796808 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796834 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: 
I0219 18:34:06.796857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" 
(UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797062 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797185 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797365 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797435 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797460 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797481 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " 
Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797528 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797551 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797573 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797598 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797626 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797696 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797771 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 
18:34:06.797798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797870 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797895 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798286 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798390 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798492 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798751 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798876 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798976 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799042 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 
18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799154 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799175 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799269 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799387 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799410 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799435 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799459 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799481 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799529 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.796920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797218 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797345 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797640 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.797817 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798306 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799803 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798523 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.798878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799461 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800008 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800181 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800261 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.800705 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:07.300680757 +0000 UTC m=+21.261900731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.800772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801275 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.799627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801374 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801401 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801515 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801558 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801580 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801603 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801638 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801656 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 
18:34:06.801676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801752 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801776 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801920 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801956 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.801989 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802063 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod 
"efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802168 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 
18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802231 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802269 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802324 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802361 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802395 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802412 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802431 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802451 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4b7c32a-5fc5-45f9-848f-f344598f6d73-mcd-auth-proxy-config\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dgvn\" (UniqueName: 
\"kubernetes.io/projected/51c18c30-ebd6-48f1-bc44-97849f648ed2-kube-api-access-5dgvn\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca3baa5f-80b1-4e21-bb81-bf35ec1675f7-hosts-file\") pod \"node-resolver-njxlm\" (UID: \"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\") " pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802652 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfmmf\" (UniqueName: \"kubernetes.io/projected/ca3baa5f-80b1-4e21-bb81-bf35ec1675f7-kube-api-access-rfmmf\") pod \"node-resolver-njxlm\" (UID: \"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\") " pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802662 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802862 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-os-release\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b4b7c32a-5fc5-45f9-848f-f344598f6d73-proxy-tls\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-cnibin\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51c18c30-ebd6-48f1-bc44-97849f648ed2-cni-binary-copy\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-multus-certs\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4b7c32a-5fc5-45f9-848f-f344598f6d73-rootfs\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-netns\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smkp7\" (UniqueName: \"kubernetes.io/projected/8a658d64-60b9-4161-93e5-8431821c07ce-kube-api-access-smkp7\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-cni-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803222 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-kubelet\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803246 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-etc-kubernetes\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6lb\" (UniqueName: \"kubernetes.io/projected/b4b7c32a-5fc5-45f9-848f-f344598f6d73-kube-api-access-bx6lb\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803327 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 
18:34:06.803352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-socket-dir-parent\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803404 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-cni-bin\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-hostroot\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 
18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-os-release\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-system-cni-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803625 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-daemon-config\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a658d64-60b9-4161-93e5-8431821c07ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-sjmng\" (UID: 
\"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8a658d64-60b9-4161-93e5-8431821c07ce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-conf-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-system-cni-dir\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-k8s-cni-cncf-io\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803887 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-cni-multus\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-cnibin\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803939 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804102 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804122 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804136 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804151 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804166 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on 
node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804180 4749 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804195 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804209 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804223 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804237 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804251 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804265 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804280 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804294 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804309 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804324 4749 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804346 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804364 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804378 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804394 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804409 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804423 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804436 4749 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804448 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804460 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804473 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804487 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804501 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804515 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804530 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804544 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804558 4749 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804571 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804584 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" 
DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804598 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804612 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804626 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804639 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804653 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804667 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804680 4749 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 
18:34:06.804724 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804740 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804755 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804770 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804783 4749 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804798 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804812 4749 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804825 4749 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804838 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804851 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804864 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804878 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804892 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804906 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.806365 4749 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802245 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802592 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.802780 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803221 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.803914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804044 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804236 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804259 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804516 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804537 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804751 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.804946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805197 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805337 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805450 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805580 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805892 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.805983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.806255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.806512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.806515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.806615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.806811 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.806798 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.807255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.807433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.807519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.807555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.807746 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.809184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.809377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.809655 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.810088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.810179 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.810336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.810342 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.810805 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.810951 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811760 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811882 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.811927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812112 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812285 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812331 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812706 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.812907 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.813084 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.813152 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.813494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.813720 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.813948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.814348 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.814821 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.815444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.815474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.815728 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.815830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.816197 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.816225 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.816408 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.816743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.817596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.817828 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.818195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.818250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.818372 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.818501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.818637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.818829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.818853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.819211 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.819421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.819930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.821723 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.821755 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.821769 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.821874 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:07.321853658 +0000 UTC m=+21.283073732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.822164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.822217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.822340 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.822751 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.822813 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:07.322798139 +0000 UTC m=+21.284018193 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.823764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.824004 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.824578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.824588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.825320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.825377 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.825426 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:07.325406679 +0000 UTC m=+21.286626633 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.825631 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.825658 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.825651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.825668 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:06 crc kubenswrapper[4749]: E0219 18:34:06.825707 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-19 18:34:07.325697035 +0000 UTC m=+21.286916989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.825704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.825942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.825963 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.825952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.834744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.834909 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.835362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.835761 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.837141 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.839963 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.840647 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.841268 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-njxlm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-njxlm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.841324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.841375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: 
"v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.841407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.841441 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.841858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.841985 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.842056 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.842207 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.842742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.843320 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.843357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.843770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.843850 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844247 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844537 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844401 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844882 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.844997 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.845217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.845483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.845961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.846000 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod 
"a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.850801 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b7c32a-5fc5-45f9-848f-f344598f6d73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nzldt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.856319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.858169 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.861868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.863568 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.865686 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a658d64-60b9-4161-93e5-8431821c07ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sjmng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.869648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.870514 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.876746 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.878017 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.884281 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.891301 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.899040 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.905924 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-njxlm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-njxlm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-system-cni-dir\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc 
kubenswrapper[4749]: I0219 18:34:06.906178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-k8s-cni-cncf-io\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-cni-multus\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-k8s-cni-cncf-io\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-cnibin\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906178 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-system-cni-dir\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906269 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4b7c32a-5fc5-45f9-848f-f344598f6d73-mcd-auth-proxy-config\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-cni-multus\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca3baa5f-80b1-4e21-bb81-bf35ec1675f7-hosts-file\") pod \"node-resolver-njxlm\" (UID: \"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\") " pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906299 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-cnibin\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfmmf\" (UniqueName: \"kubernetes.io/projected/ca3baa5f-80b1-4e21-bb81-bf35ec1675f7-kube-api-access-rfmmf\") pod \"node-resolver-njxlm\" (UID: \"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\") " pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906343 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5dgvn\" (UniqueName: \"kubernetes.io/projected/51c18c30-ebd6-48f1-bc44-97849f648ed2-kube-api-access-5dgvn\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-os-release\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906423 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca3baa5f-80b1-4e21-bb81-bf35ec1675f7-hosts-file\") pod \"node-resolver-njxlm\" (UID: \"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\") " pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-os-release\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4b7c32a-5fc5-45f9-848f-f344598f6d73-proxy-tls\") pod 
\"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51c18c30-ebd6-48f1-bc44-97849f648ed2-cni-binary-copy\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-multus-certs\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-cnibin\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906880 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4b7c32a-5fc5-45f9-848f-f344598f6d73-rootfs\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-netns\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " 
pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smkp7\" (UniqueName: \"kubernetes.io/projected/8a658d64-60b9-4161-93e5-8431821c07ce-kube-api-access-smkp7\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-cni-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-kubelet\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-etc-kubernetes\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6lb\" (UniqueName: \"kubernetes.io/projected/b4b7c32a-5fc5-45f9-848f-f344598f6d73-kube-api-access-bx6lb\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 
18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-socket-dir-parent\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907124 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-cni-bin\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4b7c32a-5fc5-45f9-848f-f344598f6d73-mcd-auth-proxy-config\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-hostroot\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907165 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-os-release\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-cni-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-system-cni-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-daemon-config\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907314 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a658d64-60b9-4161-93e5-8431821c07ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8a658d64-60b9-4161-93e5-8431821c07ce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-system-cni-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-conf-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-conf-dir\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/b4b7c32a-5fc5-45f9-848f-f344598f6d73-rootfs\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907465 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-netns\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.906934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-run-multus-certs\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51c18c30-ebd6-48f1-bc44-97849f648ed2-cni-binary-copy\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907744 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-etc-kubernetes\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-kubelet\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-cnibin\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907944 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-os-release\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.907971 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" 
(UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-host-var-lib-cni-bin\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-socket-dir-parent\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51c18c30-ebd6-48f1-bc44-97849f648ed2-hostroot\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908248 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908314 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a658d64-60b9-4161-93e5-8431821c07ce-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908378 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908414 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908427 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908438 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908457 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908468 4749 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.908912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/8a658d64-60b9-4161-93e5-8431821c07ce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.909177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4b7c32a-5fc5-45f9-848f-f344598f6d73-proxy-tls\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.910564 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51c18c30-ebd6-48f1-bc44-97849f648ed2-multus-daemon-config\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913708 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913744 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913754 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913762 4749 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913771 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913779 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913788 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913796 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913810 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913821 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913829 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913838 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913846 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913855 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913865 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913875 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913883 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913891 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913899 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913907 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913915 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913923 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913932 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913941 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913950 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 
crc kubenswrapper[4749]: I0219 18:34:06.913958 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913966 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913974 4749 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913984 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.913996 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914005 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914013 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914036 4749 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914045 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914053 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914061 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914069 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914079 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914090 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914103 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914114 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914124 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914134 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914143 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914151 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914159 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914167 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914176 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914184 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914193 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914204 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914215 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914227 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914236 4749 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 
18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914244 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914252 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914260 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914269 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914278 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914290 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914302 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914313 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914324 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914336 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914348 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914361 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914372 4749 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914383 4749 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914394 4749 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914405 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914417 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914429 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914440 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914449 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914459 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914471 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914482 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914493 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914505 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914515 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914526 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914537 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914548 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node 
\"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914560 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914571 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914581 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914593 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914604 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914615 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914627 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914639 
4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914650 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914661 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914671 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914682 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914693 4749 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914703 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914714 4749 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914725 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914737 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914748 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914758 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914769 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914780 4749 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914791 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914802 4749 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914813 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914823 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914834 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914844 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914854 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914866 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: 
I0219 18:34:06.914876 4749 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914967 4749 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914980 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.914991 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915003 4749 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915016 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915045 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915056 4749 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915067 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915078 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915091 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915101 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915113 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915124 4749 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915135 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915147 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.915158 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.918534 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b7c32a-5fc5-45f9-848f-f344598f6d73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nzldt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.923454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfmmf\" (UniqueName: \"kubernetes.io/projected/ca3baa5f-80b1-4e21-bb81-bf35ec1675f7-kube-api-access-rfmmf\") pod \"node-resolver-njxlm\" (UID: \"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\") " pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.925362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6lb\" (UniqueName: \"kubernetes.io/projected/b4b7c32a-5fc5-45f9-848f-f344598f6d73-kube-api-access-bx6lb\") pod \"machine-config-daemon-nzldt\" (UID: \"b4b7c32a-5fc5-45f9-848f-f344598f6d73\") " pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.928527 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dgvn\" (UniqueName: \"kubernetes.io/projected/51c18c30-ebd6-48f1-bc44-97849f648ed2-kube-api-access-5dgvn\") pod \"multus-4w8w4\" (UID: \"51c18c30-ebd6-48f1-bc44-97849f648ed2\") " pod="openshift-multus/multus-4w8w4" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.929713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smkp7\" (UniqueName: \"kubernetes.io/projected/8a658d64-60b9-4161-93e5-8431821c07ce-kube-api-access-smkp7\") pod \"multus-additional-cni-plugins-sjmng\" (UID: \"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.930605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a658d64-60b9-4161-93e5-8431821c07ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sjmng\" (UID: 
\"8a658d64-60b9-4161-93e5-8431821c07ce\") " pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.930564 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a658d64-60b9-4161-93e5-8431821c07ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sjmng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.942035 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.951475 4749 scope.go:117] "RemoveContainer" containerID="49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.951623 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.956535 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.965714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.968976 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4w8w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51c18c30-ebd6-48f1-bc44-97849f648ed2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4w8w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.978352 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.978874 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.989794 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-njxlm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-njxlm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.992307 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 18:34:06 crc kubenswrapper[4749]: W0219 18:34:06.992574 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-1b859109f646f41f91bf61da52da649a0ec538db706476df86ab6b066d46b9e9 WatchSource:0}: Error finding container 1b859109f646f41f91bf61da52da649a0ec538db706476df86ab6b066d46b9e9: Status 404 returned error can't find the container with id 1b859109f646f41f91bf61da52da649a0ec538db706476df86ab6b066d46b9e9 Feb 19 18:34:06 crc kubenswrapper[4749]: I0219 18:34:06.999532 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b7c32a-5fc5-45f9-848f-f344598f6d73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nzldt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.012872 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a658d64-60b9-4161-93e5-8431821c07ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sjmng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.014327 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-njxlm" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.024479 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.027737 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:34:07 crc kubenswrapper[4749]: W0219 18:34:07.034700 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3baa5f_80b1_4e21_bb81_bf35ec1675f7.slice/crio-58954697e42464f24a8af090bd63e6450c799af18f1056e7a4b4edcb25cbe712 WatchSource:0}: Error finding container 58954697e42464f24a8af090bd63e6450c799af18f1056e7a4b4edcb25cbe712: Status 404 returned error can't find the container with id 58954697e42464f24a8af090bd63e6450c799af18f1056e7a4b4edcb25cbe712 Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.035908 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.036468 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sjmng" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.041909 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4w8w4" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.047765 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4w8w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51c18c30-ebd6-48f1-bc44-97849f648ed2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4w8w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: W0219 18:34:07.051059 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b7c32a_5fc5_45f9_848f_f344598f6d73.slice/crio-ada22e5ce16e6843ccb94d8ad8ccdc34536818220b852c1774ff20fee6c69749 WatchSource:0}: Error finding container ada22e5ce16e6843ccb94d8ad8ccdc34536818220b852c1774ff20fee6c69749: Status 404 returned error can't find the container with id ada22e5ce16e6843ccb94d8ad8ccdc34536818220b852c1774ff20fee6c69749 Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.061135 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce9cc97-651c-4136-a376-5152d4db2876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI0219 18:34:00.774221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 18:34:00.778472 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-353049508/tls.crt::/tmp/serving-cert-353049508/tls.key\\\\\\\"\\\\nI0219 18:34:06.283979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 18:34:06.288892 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 18:34:06.288916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 18:34:06.288960 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 18:34:06.288967 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 18:34:06.304182 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 18:34:06.304215 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304232 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 18:34:06.304237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 18:34:06.304241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 18:34:06.304245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 18:34:06.304370 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 18:34:06.310684 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:33:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:33:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.073634 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hz5j8"] Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.074504 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.074825 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.075052 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.077925 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.078260 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.078369 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.078534 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.078632 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.079135 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.079159 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.080281 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.085244 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.096945 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.106014 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.117718 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.131709 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4w8w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51c18c30-ebd6-48f1-bc44-97849f648ed2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4w8w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.149939 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce9cc97-651c-4136-a376-5152d4db2876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 18:34:00.774221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 18:34:00.778472 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-353049508/tls.crt::/tmp/serving-cert-353049508/tls.key\\\\\\\"\\\\nI0219 18:34:06.283979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 18:34:06.288892 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 18:34:06.288916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 18:34:06.288960 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 18:34:06.288967 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 18:34:06.304182 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 18:34:06.304215 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304232 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 18:34:06.304237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 18:34:06.304241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 18:34:06.304245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 18:34:06.304370 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 18:34:06.310684 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:33:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:33:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.163077 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.168740 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.173961 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.185606 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.210229 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e6ef30-b3be-4cfe-869c-0341a645215b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hz5j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-kubelet\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-ovn\") pod \"ovnkube-node-hz5j8\" (UID: 
\"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-etc-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218303 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-netd\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218325 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-var-lib-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-bin\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-slash\") pod 
\"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-systemd-units\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-env-overrides\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nww4\" (UniqueName: \"kubernetes.io/projected/01e6ef30-b3be-4cfe-869c-0341a645215b-kube-api-access-9nww4\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-config\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-systemd\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-netns\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-ovn-kubernetes\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-script-lib\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218514 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-node-log\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01e6ef30-b3be-4cfe-869c-0341a645215b-ovn-node-metrics-cert\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.218542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-log-socket\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.222620 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.259623 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-njxlm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-njxlm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.300599 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b7c32a-5fc5-45f9-848f-f344598f6d73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nzldt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319550 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-slash\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-systemd-units\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-env-overrides\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nww4\" (UniqueName: \"kubernetes.io/projected/01e6ef30-b3be-4cfe-869c-0341a645215b-kube-api-access-9nww4\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-config\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319753 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-slash\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.319759 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:08.319732807 +0000 UTC m=+22.280952761 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-systemd\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-netns\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319867 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-ovn-kubernetes\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-script-lib\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-node-log\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01e6ef30-b3be-4cfe-869c-0341a645215b-ovn-node-metrics-cert\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-log-socket\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.319975 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-kubelet\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc 
kubenswrapper[4749]: I0219 18:34:07.319995 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-ovn\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-etc-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-systemd-units\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-netd\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320129 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-var-lib-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-bin\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-bin\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-log-socket\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-kubelet\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320285 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-ovn\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-etc-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320334 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-netd\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-var-lib-openvswitch\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-env-overrides\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-systemd\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-netns\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-node-log\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.320894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-script-lib\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.321095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-config\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.324810 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01e6ef30-b3be-4cfe-869c-0341a645215b-ovn-node-metrics-cert\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.342413 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a658d64-60b9-4161-93e5-8431821c07ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sjmng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.376954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nww4\" (UniqueName: \"kubernetes.io/projected/01e6ef30-b3be-4cfe-869c-0341a645215b-kube-api-access-9nww4\") pod \"ovnkube-node-hz5j8\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.420708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.420749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.420766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.420785 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.420908 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.420910 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.420946 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.420987 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:08.420969679 +0000 UTC m=+22.382189633 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.421003 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:08.42099666 +0000 UTC m=+22.382216614 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.421006 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.421016 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.421040 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.421067 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:08.421055151 +0000 UTC m=+22.382275105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.420924 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.421083 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:07 crc kubenswrapper[4749]: E0219 18:34:07.421100 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:08.421094702 +0000 UTC m=+22.382314656 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.489885 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:07 crc kubenswrapper[4749]: W0219 18:34:07.500497 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e6ef30_b3be_4cfe_869c_0341a645215b.slice/crio-d5049b05c8bfc42c2e39890f7612a049bcc2a26b627bedea39c2081326062474 WatchSource:0}: Error finding container d5049b05c8bfc42c2e39890f7612a049bcc2a26b627bedea39c2081326062474: Status 404 returned error can't find the container with id d5049b05c8bfc42c2e39890f7612a049bcc2a26b627bedea39c2081326062474 Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.651581 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 00:08:10.430678087 +0000 UTC Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.665800 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-19 18:29:06 +0000 UTC, rotation deadline is 2026-11-08 21:31:56.431750221 +0000 UTC Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.665880 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6290h57m48.765873615s for next certificate rotation Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.790685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5c1d977d666d5881dcdfa3d597fc08ae6b8d5c717ef6e41028fbf86608ac9b62"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.790725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1b859109f646f41f91bf61da52da649a0ec538db706476df86ab6b066d46b9e9"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.795334 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.797517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.799001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-njxlm" event={"ID":"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7","Type":"ContainerStarted","Data":"0ab2f9c946b957a71d73a2b4a9dcfe6da7664ae91d8d1c0fbcb0f57defdaf8ea"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.799053 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-njxlm" event={"ID":"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7","Type":"ContainerStarted","Data":"58954697e42464f24a8af090bd63e6450c799af18f1056e7a4b4edcb25cbe712"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.800170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7730e683c9b87b1f1a4d1393fa1507be5d86438054c61e1f76e60cd3f399e6f5"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.800194 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ab8aedaacc9f5bae5594f7e8e6752477f1f99aff7d68f58d511507935eed4ad1"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.806072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerStarted","Data":"94fc778df5cca3499bffb39d7bf2c11797d5fe4ebf742c0946eb714dbcbbb248"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.806223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerStarted","Data":"5cea7da84445992fdc29222030d9d835a680b88e49526c9fae6b8448cc6db0d5"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.807000 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"d5049b05c8bfc42c2e39890f7612a049bcc2a26b627bedea39c2081326062474"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.809497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4w8w4" event={"ID":"51c18c30-ebd6-48f1-bc44-97849f648ed2","Type":"ContainerStarted","Data":"a041fbac0086e83243af5dde989a265a8d0815962a53402b6f5c778112f054c3"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.809528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4w8w4" 
event={"ID":"51c18c30-ebd6-48f1-bc44-97849f648ed2","Type":"ContainerStarted","Data":"834660c004d589d9306470e5a0e51e0553d26e3943cd4c5de62e73c06ba429b9"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.810610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"ada22e5ce16e6843ccb94d8ad8ccdc34536818220b852c1774ff20fee6c69749"} Feb 19 18:34:07 crc kubenswrapper[4749]: I0219 18:34:07.813196 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b737caae6aa0ece2c024b145b8c9f524510b1ef82ed1e5e36bc5e68a7dc25050"} Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.330888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.331104 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:10.33107522 +0000 UTC m=+24.292295174 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.432260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.432300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.432319 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.432337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432415 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432416 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432422 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432583 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432646 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432662 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432475 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:10.432458315 +0000 UTC m=+24.393678269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432605 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432745 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432725 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:10.432714841 +0000 UTC m=+24.393934795 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432793 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:10.432781082 +0000 UTC m=+24.394001036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.432804 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:10.432798772 +0000 UTC m=+24.394018716 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.652628 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 02:34:36.765776336 +0000 UTC Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.677872 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.678006 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.678135 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.678240 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.678344 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 18:34:08 crc kubenswrapper[4749]: E0219 18:34:08.678394 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.681497 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.682283 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.683414 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.684308 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.685148 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.685858 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.686713 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.687492 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.688473 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.689453 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.690207 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.693675 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.694370 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.695221 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.696527 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.697271 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.698631 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.699331 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.700236 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.701701 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.702518 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.703907 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.704590 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.706068 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.707266 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.708143 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.709629 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.710358 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.711886 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.712625 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.713890 4749 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.714064 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.716522 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.717396 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.718745 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.720916 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.722052 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.723533 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.724304 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.725350 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.725833 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.726774 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.727515 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.728480 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.728951 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.729918 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.730452 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.732011 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.732512 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.733339 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.733789 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.734684 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.735425 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.735907 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.816463 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674" exitCode=0 Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.816518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.818822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd"} Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.818851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"fcbf9629890eee3b536ecb92a62d87812c0720ee814a6859674ce7303be101b2"} Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.823532 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"38d9895704bfddce83b6ea7053b3c3ca469509ac2a362501481636d6363509fb"} Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.824958 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a658d64-60b9-4161-93e5-8431821c07ce" containerID="94fc778df5cca3499bffb39d7bf2c11797d5fe4ebf742c0946eb714dbcbbb248" exitCode=0 Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.825048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerDied","Data":"94fc778df5cca3499bffb39d7bf2c11797d5fe4ebf742c0946eb714dbcbbb248"} Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.839222 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.854475 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-njxlm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-njxlm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.868847 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b7c32a-5fc5-45f9-848f-f344598f6d73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nzldt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.883327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a658d64-60b9-4161-93e5-8431821c07ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sjmng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.897059 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.912742 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.931693 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e82d8e4-2f15-4e4b-87c6-0fc518e286ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982d904c52a06d7228b35166501e609ec23e23e834d0d6d4b5e8bc8ab9150411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3881770674ef4130e3b94e5d7424d9474dc6a57df26af788808d649d3933ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d08fedcd92f642d5ba6d00bef77973a15b18fc25a0bd12a3eac4c79ae642acc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f89a78133a284f9f26d5276d2e9de76077a925a7e504cb03a523a2a501cb787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:33:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.948494 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4w8w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51c18c30-ebd6-48f1-bc44-97849f648ed2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4w8w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.962968 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.977092 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:08 crc kubenswrapper[4749]: I0219 18:34:08.987893 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:08Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.008768 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e6ef30-b3be-4cfe-869c-0341a645215b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hz5j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.020865 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce9cc97-651c-4136-a376-5152d4db2876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 18:34:00.774221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 18:34:00.778472 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-353049508/tls.crt::/tmp/serving-cert-353049508/tls.key\\\\\\\"\\\\nI0219 18:34:06.283979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 18:34:06.288892 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 18:34:06.288916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 18:34:06.288960 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 18:34:06.288967 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 18:34:06.304182 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 18:34:06.304215 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304232 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 18:34:06.304237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 18:34:06.304241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 18:34:06.304245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 18:34:06.304370 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 18:34:06.310684 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:33:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:33:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.034086 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e82d8e4-2f15-4e4b-87c6-0fc518e286ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982d904c52a06d7228b35166501e609ec23e23e834d0d6d4b5e8bc8ab9150411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3881770674ef4130e3b94e5d7424d9474dc6a57df26af788808d649d3933ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d08fedcd92f642d5ba6d00bef77973a15b18fc25a0bd12a3eac4c79ae642acc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f89a78133a284f9f26d5276d2e9de76077a925a7e504cb03a523a2a501cb787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:33:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.046930 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7730e683c9b87b1f1a4d1393fa1507be5d86438054c61e1f76e60cd3f399e6f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.064755 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38d9895704bfddce83b6ea7053b3c3ca469509ac2a362501481636d6363509fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://5c1d977d666d5881dcdfa3d597fc08ae6b8d5c717ef6e41028fbf86608ac9b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.084229 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4w8w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51c18c30-ebd6-48f1-bc44-97849f648ed2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a041fbac0086e83243af5dde989a265a8d0815962a53402b6f5c778112f054c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4w8w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.109278 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce9cc97-651c-4136-a376-5152d4db2876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 18:34:00.774221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 18:34:00.778472 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-353049508/tls.crt::/tmp/serving-cert-353049508/tls.key\\\\\\\"\\\\nI0219 18:34:06.283979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 18:34:06.288892 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 18:34:06.288916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 18:34:06.288960 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 18:34:06.288967 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 18:34:06.304182 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 18:34:06.304215 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304232 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 18:34:06.304237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 18:34:06.304241 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 18:34:06.304245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 18:34:06.304370 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 18:34:06.310684 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:33:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:33:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.121448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.131229 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.140421 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.157826 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e6ef30-b3be-4cfe-869c-0341a645215b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hz5j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.168841 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.179095 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-njxlm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab2f9c946b957a71d73a2b4a9dcfe6da7664ae91d8d1c0fbcb0f57defdaf8ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-njxlm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.192943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b7c32a-5fc5-45f9-848f-f344598f6d73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcbf9629890eee3b536ecb92a62d87812c0720ee814a6859674ce7303be101b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nzldt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.207806 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a658d64-60b9-4161-93e5-8431821c07ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94fc778df5cca3499bffb39d7bf2c11797d5fe4ebf742c0946eb714dbcbbb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fc778df5cca3499bffb39d7bf2c11797d5fe4ebf742c0946eb714dbcbbb248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sjmng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.653501 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 20:14:00.32121636 +0000 UTC Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.835005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.835082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} Feb 19 18:34:09 crc 
kubenswrapper[4749]: I0219 18:34:09.835098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.835109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.835122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.835132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.837142 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a658d64-60b9-4161-93e5-8431821c07ce" containerID="964f31092d5fbebf0e81c9f02aa83ac92d7b44f5882e7e6d73e684b6a8cba334" exitCode=0 Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.837225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerDied","Data":"964f31092d5fbebf0e81c9f02aa83ac92d7b44f5882e7e6d73e684b6a8cba334"} Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.839794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4ba403e638ef7f9c6f3815a12c68fc83530137b2df138d8eaef6400043e79e3d"} Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.857255 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e6ef30-b3be-4cfe-869c-0341a645215b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nww4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hz5j8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.868882 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ce9cc97-651c-4136-a376-5152d4db2876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:33:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 18:34:00.774221 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 18:34:00.778472 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-353049508/tls.crt::/tmp/serving-cert-353049508/tls.key\\\\\\\"\\\\nI0219 18:34:06.283979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 18:34:06.288892 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 18:34:06.288916 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 18:34:06.288960 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 18:34:06.288967 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 18:34:06.304182 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 18:34:06.304215 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 18:34:06.304232 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 18:34:06.304237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 18:34:06.304241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 18:34:06.304245 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 18:34:06.304370 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 18:34:06.310684 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:33:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:33:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T18:33:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:33:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.881166 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.895010 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.904260 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.917076 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a658d64-60b9-4161-93e5-8431821c07ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94fc778df5cca3499bffb39d7bf2c11797d5fe4ebf742c0946eb714dbcbbb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fc778df5cca3499bffb39d7bf2c11797d5fe4ebf742c0946eb714dbcbbb248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://964f31092d5fbebf0e81c9f02aa83ac92d7b44f5882e7e6d73e684b6a8cba334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://964f31092d5fbebf0e81c9f02aa83ac92d7b44f5882e7e6d73e684b6a8cba334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T18:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T18:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smkp7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sjmng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.929421 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.938805 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-njxlm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3baa5f-80b1-4e21-bb81-bf35ec1675f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab2f9c946b957a71d73a2b4a9dcfe6da7664ae91d8d1c0fbcb0f57defdaf8ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfmmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-njxlm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.949836 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b7c32a-5fc5-45f9-848f-f344598f6d73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T18:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcbf9629890eee3b536ecb92a62d87812c0720ee814a6859674ce7303be101b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T18:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx6lb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T18:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nzldt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T18:34:09Z is after 2025-08-24T17:21:41Z" Feb 19 18:34:09 crc kubenswrapper[4749]: I0219 18:34:09.972916 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.9728932869999998 podStartE2EDuration="2.972893287s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:09.972697132 +0000 UTC m=+23.933917086" watchObservedRunningTime="2026-02-19 18:34:09.972893287 +0000 UTC m=+23.934113241" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.024371 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4w8w4" podStartSLOduration=4.024351227 podStartE2EDuration="4.024351227s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:10.013827667 +0000 UTC m=+23.975047621" watchObservedRunningTime="2026-02-19 18:34:10.024351227 +0000 UTC m=+23.985571191" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.024667 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-njxlm" podStartSLOduration=4.024662484 podStartE2EDuration="4.024662484s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:10.024552971 +0000 UTC m=+23.985772925" watchObservedRunningTime="2026-02-19 18:34:10.024662484 +0000 UTC m=+23.985882458" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.044747 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podStartSLOduration=4.04472906 podStartE2EDuration="4.04472906s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:10.035616053 +0000 UTC m=+23.996836017" watchObservedRunningTime="2026-02-19 18:34:10.04472906 +0000 UTC m=+24.005949014" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.045004 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hk7fv"] Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.045340 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.048922 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.049252 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.050455 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.057059 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.137206 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=4.137180562 podStartE2EDuration="4.137180562s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:10.126853307 
+0000 UTC m=+24.088073261" watchObservedRunningTime="2026-02-19 18:34:10.137180562 +0000 UTC m=+24.098400516" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.147698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acfe27c9-df56-4de2-888a-d1f08570caa5-host\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.147738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnvb\" (UniqueName: \"kubernetes.io/projected/acfe27c9-df56-4de2-888a-d1f08570caa5-kube-api-access-7xnvb\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.147762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/acfe27c9-df56-4de2-888a-d1f08570caa5-serviceca\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.225893 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm"] Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.226571 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.228670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.228712 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.242194 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vw4bt"] Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.242689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.242757 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vw4bt" podUID="8771d522-aad3-4c8d-8f8b-eccc155fbf71" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.248561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acfe27c9-df56-4de2-888a-d1f08570caa5-host\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.248602 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnvb\" (UniqueName: \"kubernetes.io/projected/acfe27c9-df56-4de2-888a-d1f08570caa5-kube-api-access-7xnvb\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.248633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/acfe27c9-df56-4de2-888a-d1f08570caa5-serviceca\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.248713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acfe27c9-df56-4de2-888a-d1f08570caa5-host\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.249658 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/acfe27c9-df56-4de2-888a-d1f08570caa5-serviceca\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.272977 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnvb\" (UniqueName: \"kubernetes.io/projected/acfe27c9-df56-4de2-888a-d1f08570caa5-kube-api-access-7xnvb\") pod \"node-ca-hk7fv\" (UID: \"acfe27c9-df56-4de2-888a-d1f08570caa5\") " pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.349945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.350156 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:14.350125404 +0000 UTC m=+28.311345368 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.350426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d288x\" (UniqueName: \"kubernetes.io/projected/8771d522-aad3-4c8d-8f8b-eccc155fbf71-kube-api-access-d288x\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.350559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec1c9f32-db08-4461-9b1e-1f7448119124-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.350674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec1c9f32-db08-4461-9b1e-1f7448119124-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.350806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq967\" (UniqueName: 
\"kubernetes.io/projected/ec1c9f32-db08-4461-9b1e-1f7448119124-kube-api-access-qq967\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.350904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.351068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec1c9f32-db08-4461-9b1e-1f7448119124-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.357081 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hk7fv" Feb 19 18:34:10 crc kubenswrapper[4749]: W0219 18:34:10.374398 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacfe27c9_df56_4de2_888a_d1f08570caa5.slice/crio-681f896357ffeb384627686d69a613ed73577f77a60d055d06002e717c606a78 WatchSource:0}: Error finding container 681f896357ffeb384627686d69a613ed73577f77a60d055d06002e717c606a78: Status 404 returned error can't find the container with id 681f896357ffeb384627686d69a613ed73577f77a60d055d06002e717c606a78 Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq967\" (UniqueName: \"kubernetes.io/projected/ec1c9f32-db08-4461-9b1e-1f7448119124-kube-api-access-qq967\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec1c9f32-db08-4461-9b1e-1f7448119124-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452725 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d288x\" (UniqueName: \"kubernetes.io/projected/8771d522-aad3-4c8d-8f8b-eccc155fbf71-kube-api-access-d288x\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.452837 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.452919 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs podName:8771d522-aad3-4c8d-8f8b-eccc155fbf71 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:10.95289832 +0000 UTC m=+24.914118274 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs") pod "network-metrics-daemon-vw4bt" (UID: "8771d522-aad3-4c8d-8f8b-eccc155fbf71") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec1c9f32-db08-4461-9b1e-1f7448119124-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453431 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453496 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:14.453477562 +0000 UTC m=+28.414697596 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453565 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453607 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:14.453595036 +0000 UTC m=+28.414815110 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.453639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec1c9f32-db08-4461-9b1e-1f7448119124-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453835 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453856 
4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453870 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.453913 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:14.453900783 +0000 UTC m=+28.415120747 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.452976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.454113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/ec1c9f32-db08-4461-9b1e-1f7448119124-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.454148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.454236 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.454261 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.454271 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.454303 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:14.454292531 +0000 UTC m=+28.415512495 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.454836 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec1c9f32-db08-4461-9b1e-1f7448119124-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.457295 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec1c9f32-db08-4461-9b1e-1f7448119124-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: \"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.488998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d288x\" (UniqueName: \"kubernetes.io/projected/8771d522-aad3-4c8d-8f8b-eccc155fbf71-kube-api-access-d288x\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.491055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq967\" (UniqueName: \"kubernetes.io/projected/ec1c9f32-db08-4461-9b1e-1f7448119124-kube-api-access-qq967\") pod \"ovnkube-control-plane-749d76644c-4mdbm\" (UID: 
\"ec1c9f32-db08-4461-9b1e-1f7448119124\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm"
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.542260 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm"
Feb 19 18:34:10 crc kubenswrapper[4749]: W0219 18:34:10.568885 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec1c9f32_db08_4461_9b1e_1f7448119124.slice/crio-89a83a92159c59dba60986b93b2cc721baa9adf095a1d4dacb4dd8b6dd9570b5 WatchSource:0}: Error finding container 89a83a92159c59dba60986b93b2cc721baa9adf095a1d4dacb4dd8b6dd9570b5: Status 404 returned error can't find the container with id 89a83a92159c59dba60986b93b2cc721baa9adf095a1d4dacb4dd8b6dd9570b5
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.654814 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 18:46:20.810605599 +0000 UTC
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.678725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.678826 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.678941 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.679094 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.679185 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.679247 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.852177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" event={"ID":"ec1c9f32-db08-4461-9b1e-1f7448119124","Type":"ContainerStarted","Data":"d5d80298b6079a7028e19fde91570afdb50d6fbe75fda951fcfe608964648866"}
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.852485 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" event={"ID":"ec1c9f32-db08-4461-9b1e-1f7448119124","Type":"ContainerStarted","Data":"89a83a92159c59dba60986b93b2cc721baa9adf095a1d4dacb4dd8b6dd9570b5"}
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.853743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hk7fv" event={"ID":"acfe27c9-df56-4de2-888a-d1f08570caa5","Type":"ContainerStarted","Data":"2d6e6fd6e8d9be62b476703e1f267094078d159606f1cd922bbcf637ebe20b80"}
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.853801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hk7fv" event={"ID":"acfe27c9-df56-4de2-888a-d1f08570caa5","Type":"ContainerStarted","Data":"681f896357ffeb384627686d69a613ed73577f77a60d055d06002e717c606a78"}
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.856516 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a658d64-60b9-4161-93e5-8431821c07ce" containerID="001d7a882d1ed25ccf75fba6788ddcdc811d7a2ef5d6a25081ef17a6a202ab33" exitCode=0
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.856614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerDied","Data":"001d7a882d1ed25ccf75fba6788ddcdc811d7a2ef5d6a25081ef17a6a202ab33"}
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.865491 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hk7fv" podStartSLOduration=3.86547229 podStartE2EDuration="3.86547229s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:10.864869416 +0000 UTC m=+24.826089380" watchObservedRunningTime="2026-02-19 18:34:10.86547229 +0000 UTC m=+24.826692254"
Feb 19 18:34:10 crc kubenswrapper[4749]: I0219 18:34:10.958371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt"
Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.958547 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 18:34:10 crc kubenswrapper[4749]: E0219 18:34:10.958622 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs podName:8771d522-aad3-4c8d-8f8b-eccc155fbf71 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:11.958605057 +0000 UTC m=+25.919825011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs") pod "network-metrics-daemon-vw4bt" (UID: "8771d522-aad3-4c8d-8f8b-eccc155fbf71") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.038018 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.656044 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 21:03:15.71262963 +0000 UTC
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.678552 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt"
Feb 19 18:34:11 crc kubenswrapper[4749]: E0219 18:34:11.678687 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw4bt" podUID="8771d522-aad3-4c8d-8f8b-eccc155fbf71"
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.862877 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a658d64-60b9-4161-93e5-8431821c07ce" containerID="469c79573b7cd7aa5524df72e983510d94a13033328630ae30addf745e502799" exitCode=0
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.862988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerDied","Data":"469c79573b7cd7aa5524df72e983510d94a13033328630ae30addf745e502799"}
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.866633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" event={"ID":"ec1c9f32-db08-4461-9b1e-1f7448119124","Type":"ContainerStarted","Data":"f4e037308fd095541253d7abaddb726bb0a006cc3898607445328e78516e2a46"}
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.936612 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mdbm" podStartSLOduration=4.936592152 podStartE2EDuration="4.936592152s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:11.935871375 +0000 UTC m=+25.897091329" watchObservedRunningTime="2026-02-19 18:34:11.936592152 +0000 UTC m=+25.897812116"
Feb 19 18:34:11 crc kubenswrapper[4749]: I0219 18:34:11.971262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt"
Feb 19 18:34:11 crc kubenswrapper[4749]: E0219 18:34:11.971387 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 18:34:11 crc kubenswrapper[4749]: E0219 18:34:11.971476 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs podName:8771d522-aad3-4c8d-8f8b-eccc155fbf71 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:13.971458464 +0000 UTC m=+27.932678418 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs") pod "network-metrics-daemon-vw4bt" (UID: "8771d522-aad3-4c8d-8f8b-eccc155fbf71") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.656373 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 10:15:16.297145299 +0000 UTC
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.680899 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.681253 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 18:34:12 crc kubenswrapper[4749]: E0219 18:34:12.681355 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.681374 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 18:34:12 crc kubenswrapper[4749]: E0219 18:34:12.681713 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 18:34:12 crc kubenswrapper[4749]: E0219 18:34:12.681786 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.684064 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.685834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.685859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.685871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.685932 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.693563 4749 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.693886 4749 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.695143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.695162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.695172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.695186 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.695199 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T18:34:12Z","lastTransitionTime":"2026-02-19T18:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.742071 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"]
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.742563 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.744893 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.744930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.745095 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.745109 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.876315 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a658d64-60b9-4161-93e5-8431821c07ce" containerID="a035c48cfb133ae44e090f7fc92099cacb6cafe48f2756a290847e9ac3274553" exitCode=0
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.876443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerDied","Data":"a035c48cfb133ae44e090f7fc92099cacb6cafe48f2756a290847e9ac3274553"}
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.887016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e534a522-f5cc-43f8-8f32-63fc30717aed-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.887074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"}
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.887104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e534a522-f5cc-43f8-8f32-63fc30717aed-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.887132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e534a522-f5cc-43f8-8f32-63fc30717aed-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.887222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e534a522-f5cc-43f8-8f32-63fc30717aed-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.887331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e534a522-f5cc-43f8-8f32-63fc30717aed-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.988493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e534a522-f5cc-43f8-8f32-63fc30717aed-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.988565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e534a522-f5cc-43f8-8f32-63fc30717aed-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.988656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e534a522-f5cc-43f8-8f32-63fc30717aed-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.988689 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e534a522-f5cc-43f8-8f32-63fc30717aed-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.988748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e534a522-f5cc-43f8-8f32-63fc30717aed-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.990241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e534a522-f5cc-43f8-8f32-63fc30717aed-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.990324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e534a522-f5cc-43f8-8f32-63fc30717aed-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.990673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e534a522-f5cc-43f8-8f32-63fc30717aed-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:12 crc kubenswrapper[4749]: I0219 18:34:12.995389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e534a522-f5cc-43f8-8f32-63fc30717aed-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.007907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e534a522-f5cc-43f8-8f32-63fc30717aed-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q4smw\" (UID: \"e534a522-f5cc-43f8-8f32-63fc30717aed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.218877 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw"
Feb 19 18:34:13 crc kubenswrapper[4749]: W0219 18:34:13.235956 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode534a522_f5cc_43f8_8f32_63fc30717aed.slice/crio-a453bc0055a28648cb58dfcef89c07554e71a6c4e4f460f84b45b3a8ba9abc2d WatchSource:0}: Error finding container a453bc0055a28648cb58dfcef89c07554e71a6c4e4f460f84b45b3a8ba9abc2d: Status 404 returned error can't find the container with id a453bc0055a28648cb58dfcef89c07554e71a6c4e4f460f84b45b3a8ba9abc2d
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.657454 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 04:00:44.776476386 +0000 UTC
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.657507 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.667390 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.678125 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt"
Feb 19 18:34:13 crc kubenswrapper[4749]: E0219 18:34:13.678295 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw4bt" podUID="8771d522-aad3-4c8d-8f8b-eccc155fbf71"
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.897074 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a658d64-60b9-4161-93e5-8431821c07ce" containerID="eb22f61888e5678591e62438bd39c149a5c7c64ecf568bee9085280d3f2ade70" exitCode=0
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.897152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerDied","Data":"eb22f61888e5678591e62438bd39c149a5c7c64ecf568bee9085280d3f2ade70"}
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.899590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw" event={"ID":"e534a522-f5cc-43f8-8f32-63fc30717aed","Type":"ContainerStarted","Data":"b3bb8d15bcf62e6a96c5185a43fde6279ef85a27ed100a10af7e9ccf7005eb45"}
Feb 19 18:34:13 crc kubenswrapper[4749]: I0219 18:34:13.899659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw" event={"ID":"e534a522-f5cc-43f8-8f32-63fc30717aed","Type":"ContainerStarted","Data":"a453bc0055a28648cb58dfcef89c07554e71a6c4e4f460f84b45b3a8ba9abc2d"}
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.018290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt"
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.018947 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.019022 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs podName:8771d522-aad3-4c8d-8f8b-eccc155fbf71 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:18.019002355 +0000 UTC m=+31.980222319 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs") pod "network-metrics-daemon-vw4bt" (UID: "8771d522-aad3-4c8d-8f8b-eccc155fbf71") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.422531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.422769 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.422753214 +0000 UTC m=+36.383973168 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.523556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.523656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.523745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523776 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523802 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523864 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.523846843 +0000 UTC m=+36.485066797 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523880 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.523874423 +0000 UTC m=+36.485094367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523942 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523939 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.523789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523975 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523992 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.524092 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.524068038 +0000 UTC m=+36.485288032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.523953 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.524130 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.524169 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.524155979 +0000 UTC m=+36.485375973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.678637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.678714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.678637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.678845 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.679077 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 18:34:14 crc kubenswrapper[4749]: E0219 18:34:14.679174 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.906803 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sjmng" event={"ID":"8a658d64-60b9-4161-93e5-8431821c07ce","Type":"ContainerStarted","Data":"e0a06c247492b0c2c8f67dbddb007c7eb5f85eefb3315280910f07e5f31e4934"}
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.911762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerStarted","Data":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"}
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.912061 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.912578 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.912600 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8"
Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.933497 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sjmng"
podStartSLOduration=8.933478885 podStartE2EDuration="8.933478885s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:14.933435834 +0000 UTC m=+28.894655818" watchObservedRunningTime="2026-02-19 18:34:14.933478885 +0000 UTC m=+28.894698849" Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.942199 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.943557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.948449 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q4smw" podStartSLOduration=8.948432925 podStartE2EDuration="8.948432925s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:14.948113158 +0000 UTC m=+28.909333162" watchObservedRunningTime="2026-02-19 18:34:14.948432925 +0000 UTC m=+28.909652899" Feb 19 18:34:14 crc kubenswrapper[4749]: I0219 18:34:14.975569 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podStartSLOduration=8.975551282 podStartE2EDuration="8.975551282s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:14.974864966 +0000 UTC m=+28.936084960" watchObservedRunningTime="2026-02-19 18:34:14.975551282 +0000 UTC m=+28.936771236" Feb 19 18:34:15 crc kubenswrapper[4749]: I0219 18:34:15.678458 4749 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:15 crc kubenswrapper[4749]: E0219 18:34:15.678595 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw4bt" podUID="8771d522-aad3-4c8d-8f8b-eccc155fbf71" Feb 19 18:34:16 crc kubenswrapper[4749]: I0219 18:34:16.426699 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vw4bt"] Feb 19 18:34:16 crc kubenswrapper[4749]: I0219 18:34:16.426807 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:16 crc kubenswrapper[4749]: E0219 18:34:16.426903 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw4bt" podUID="8771d522-aad3-4c8d-8f8b-eccc155fbf71" Feb 19 18:34:16 crc kubenswrapper[4749]: I0219 18:34:16.509074 4749 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 18:34:16 crc kubenswrapper[4749]: I0219 18:34:16.679356 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:16 crc kubenswrapper[4749]: I0219 18:34:16.679443 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:16 crc kubenswrapper[4749]: E0219 18:34:16.679685 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 18:34:16 crc kubenswrapper[4749]: I0219 18:34:16.679990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:16 crc kubenswrapper[4749]: E0219 18:34:16.680095 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 18:34:16 crc kubenswrapper[4749]: E0219 18:34:16.680145 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 18:34:18 crc kubenswrapper[4749]: I0219 18:34:18.072006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:18 crc kubenswrapper[4749]: E0219 18:34:18.072172 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 18:34:18 crc kubenswrapper[4749]: E0219 18:34:18.072262 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs podName:8771d522-aad3-4c8d-8f8b-eccc155fbf71 nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.072243155 +0000 UTC m=+40.033463109 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs") pod "network-metrics-daemon-vw4bt" (UID: "8771d522-aad3-4c8d-8f8b-eccc155fbf71") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 18:34:18 crc kubenswrapper[4749]: I0219 18:34:18.679116 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:18 crc kubenswrapper[4749]: I0219 18:34:18.679170 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:18 crc kubenswrapper[4749]: I0219 18:34:18.679208 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:18 crc kubenswrapper[4749]: E0219 18:34:18.679328 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw4bt" podUID="8771d522-aad3-4c8d-8f8b-eccc155fbf71" Feb 19 18:34:18 crc kubenswrapper[4749]: E0219 18:34:18.679454 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 18:34:18 crc kubenswrapper[4749]: E0219 18:34:18.679589 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 18:34:18 crc kubenswrapper[4749]: I0219 18:34:18.679289 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:18 crc kubenswrapper[4749]: E0219 18:34:18.679710 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.033822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.034068 4749 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.093637 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d49ll"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.093937 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qbnpr"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.094319 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-497mk"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.099466 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.104887 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.114813 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.115717 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.116162 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.116158 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.116191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.116237 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.117537 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.117770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.119515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.120546 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 18:34:19 crc 
kubenswrapper[4749]: I0219 18:34:19.120738 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.121321 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.122867 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.123523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.123973 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2wn9n"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.125212 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.125422 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.125915 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.129695 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7vbmz"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.130372 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7vbmz" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.132117 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.133171 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.133673 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.134334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.137334 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.141958 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q7rpj"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.142764 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nxsch"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.142822 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.142767 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.144421 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.145075 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.145448 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.145580 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.145926 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.148338 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.148398 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.148669 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.148771 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.150292 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.150806 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.151078 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.151323 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.151671 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.151905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.152098 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.152293 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.152578 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.152771 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.153273 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.153510 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w4nkj"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.153669 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.153853 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.154092 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rnkj5"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.154430 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.154556 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-slnmj"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.155068 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.155255 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.183886 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.184338 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.184849 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.186434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.191582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sl26\" (UniqueName: \"kubernetes.io/projected/1d278b7c-89ac-4929-9820-911c03fdb680-kube-api-access-9sl26\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.191792 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-default-certificate\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.191859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-service-ca-bundle\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.191893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-image-import-ca\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.191934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bslmj\" (UniqueName: \"kubernetes.io/projected/c553f8a1-0943-4437-9cd4-ef9192718b1e-kube-api-access-bslmj\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.191982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4586ae5a-18b0-474f-8581-a2cd850ea693-service-ca-bundle\") pod \"router-default-5444994796-slnmj\" (UID: 
\"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-etcd-client\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9blt\" (UniqueName: \"kubernetes.io/projected/0263ca67-c191-4560-be01-a2ad7fbeea4f-kube-api-access-m9blt\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c553f8a1-0943-4437-9cd4-ef9192718b1e-serving-cert\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-stats-auth\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-auth-proxy-config\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c553f8a1-0943-4437-9cd4-ef9192718b1e-trusted-ca\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192438 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192463 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qw44\" (UniqueName: \"kubernetes.io/projected/10b818d4-d6fd-4377-bd69-510eda43e365-kube-api-access-4qw44\") pod \"cluster-samples-operator-665b6dd947-8cg8w\" (UID: \"10b818d4-d6fd-4377-bd69-510eda43e365\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-machine-approver-tls\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c30b3c06-11f3-450e-8793-4bacb3756a3e-node-pullsecrets\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10b818d4-d6fd-4377-bd69-510eda43e365-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8cg8w\" (UID: \"10b818d4-d6fd-4377-bd69-510eda43e365\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192735 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-config\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c30b3c06-11f3-450e-8793-4bacb3756a3e-audit-dir\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.192968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d278b7c-89ac-4929-9820-911c03fdb680-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193056 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm22m\" (UniqueName: \"kubernetes.io/projected/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-kube-api-access-gm22m\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2xz\" (UniqueName: \"kubernetes.io/projected/4586ae5a-18b0-474f-8581-a2cd850ea693-kube-api-access-6q2xz\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " 
pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193144 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-serving-cert\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193346 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-encryption-config\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193385 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h49f\" (UniqueName: \"kubernetes.io/projected/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-kube-api-access-2h49f\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193421 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193474 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193357 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193572 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-available-featuregates\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-etcd-serving-ca\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-config\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 
crc kubenswrapper[4749]: I0219 18:34:19.193797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-config\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c553f8a1-0943-4437-9cd4-ef9192718b1e-config\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0263ca67-c191-4560-be01-a2ad7fbeea4f-serving-cert\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d278b7c-89ac-4929-9820-911c03fdb680-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-metrics-certs\") pod 
\"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-audit\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193953 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx8tj\" (UniqueName: \"kubernetes.io/projected/c30b3c06-11f3-450e-8793-4bacb3756a3e-kube-api-access-tx8tj\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.193984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-serving-cert\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.196339 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.200188 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.200451 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 
18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.216480 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.216859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.216991 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.217240 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.217410 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.234728 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.234982 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235124 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235215 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235383 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235523 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" 
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235637 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.236143 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.236367 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.236458 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.236529 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.236621 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.236706 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.236795 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235657 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.235167 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.237300 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.237655 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.237831 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.245556 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.245602 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.245651 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.245799 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.248402 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.249945 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.250224 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.250311 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.250621 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.250388 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.250461 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.250937 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.251089 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.251167 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.251174 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.251200 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.251281 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.251319 4749 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.252238 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.252825 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.256377 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.260119 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.260680 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.261276 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.264108 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.278255 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.278514 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.278704 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.278803 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.278947 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.279319 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.279438 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.278875 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.279363 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.283321 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.283995 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.290520 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.290533 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.292474 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.294406 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4586ae5a-18b0-474f-8581-a2cd850ea693-service-ca-bundle\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bslmj\" (UniqueName: \"kubernetes.io/projected/c553f8a1-0943-4437-9cd4-ef9192718b1e-kube-api-access-bslmj\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-etcd-client\") pod \"apiserver-76f77b778f-497mk\" (UID: 
\"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9blt\" (UniqueName: \"kubernetes.io/projected/0263ca67-c191-4560-be01-a2ad7fbeea4f-kube-api-access-m9blt\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295732 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c553f8a1-0943-4437-9cd4-ef9192718b1e-serving-cert\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c553f8a1-0943-4437-9cd4-ef9192718b1e-trusted-ca\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-stats-auth\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-auth-proxy-config\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295816 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qw44\" (UniqueName: \"kubernetes.io/projected/10b818d4-d6fd-4377-bd69-510eda43e365-kube-api-access-4qw44\") pod \"cluster-samples-operator-665b6dd947-8cg8w\" (UID: \"10b818d4-d6fd-4377-bd69-510eda43e365\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295830 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-machine-approver-tls\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c30b3c06-11f3-450e-8793-4bacb3756a3e-node-pullsecrets\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 
18:34:19.295866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10b818d4-d6fd-4377-bd69-510eda43e365-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8cg8w\" (UID: \"10b818d4-d6fd-4377-bd69-510eda43e365\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-config\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c30b3c06-11f3-450e-8793-4bacb3756a3e-audit-dir\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm22m\" (UniqueName: \"kubernetes.io/projected/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-kube-api-access-gm22m\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d278b7c-89ac-4929-9820-911c03fdb680-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2xz\" (UniqueName: \"kubernetes.io/projected/4586ae5a-18b0-474f-8581-a2cd850ea693-kube-api-access-6q2xz\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-etcd-serving-ca\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.295995 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-serving-cert\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-encryption-config\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h49f\" (UniqueName: \"kubernetes.io/projected/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-kube-api-access-2h49f\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296051 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-available-featuregates\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-config\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-config\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-metrics-certs\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296110 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296391 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c553f8a1-0943-4437-9cd4-ef9192718b1e-config\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296840 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.296116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c553f8a1-0943-4437-9cd4-ef9192718b1e-config\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0263ca67-c191-4560-be01-a2ad7fbeea4f-serving-cert\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d278b7c-89ac-4929-9820-911c03fdb680-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-audit\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297222 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx8tj\" (UniqueName: \"kubernetes.io/projected/c30b3c06-11f3-450e-8793-4bacb3756a3e-kube-api-access-tx8tj\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-serving-cert\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sl26\" (UniqueName: \"kubernetes.io/projected/1d278b7c-89ac-4929-9820-911c03fdb680-kube-api-access-9sl26\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-default-certificate\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-image-import-ca\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-service-ca-bundle\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4586ae5a-18b0-474f-8581-a2cd850ea693-service-ca-bundle\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.297909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-service-ca-bundle\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.301113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c30b3c06-11f3-450e-8793-4bacb3756a3e-audit-dir\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.305193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-config\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.306105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-available-featuregates\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.309008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-image-import-ca\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.309642 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qbnpr"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.309661 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2dbbd"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.312147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c30b3c06-11f3-450e-8793-4bacb3756a3e-node-pullsecrets\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.315778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.316303 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.332949 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d278b7c-89ac-4929-9820-911c03fdb680-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.333608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-audit\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.334699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10b818d4-d6fd-4377-bd69-510eda43e365-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8cg8w\" (UID: \"10b818d4-d6fd-4377-bd69-510eda43e365\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.335554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-default-certificate\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.339116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-etcd-serving-ca\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.339416 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.340130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-machine-approver-tls\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.340417 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.340443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-serving-cert\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.340617 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jxcr"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.340910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-etcd-client\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.341459 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.341526 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.342607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c553f8a1-0943-4437-9cd4-ef9192718b1e-serving-cert\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.342855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-config\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.343154 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2wn9n"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.343395 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.343414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d278b7c-89ac-4929-9820-911c03fdb680-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.343813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-encryption-config\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.350160 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.350751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0263ca67-c191-4560-be01-a2ad7fbeea4f-serving-cert\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.350859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-stats-auth\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.351123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4586ae5a-18b0-474f-8581-a2cd850ea693-metrics-certs\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.351163 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.351761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30b3c06-11f3-450e-8793-4bacb3756a3e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.351835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-auth-proxy-config\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.351878 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.352154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-config\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.352772 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.356381 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.356419 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q7rpj"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.356430 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.356596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30b3c06-11f3-450e-8793-4bacb3756a3e-serving-cert\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.356865 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.359823 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d49ll"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.359850 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.360306 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.360390 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.362375 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.362461 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.363614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0263ca67-c191-4560-be01-a2ad7fbeea4f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.367264 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.367301 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.367795 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.368242 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.368655 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.371728 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.372370 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.372957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c553f8a1-0943-4437-9cd4-ef9192718b1e-trusted-ca\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.375057 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.376209 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.376290 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.378320 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.378977 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.381943 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6hzt4"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.385658 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v8227"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.387316 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.389059 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.389178 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v8227"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.389917 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.390523 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.390836 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8tc74"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.391162 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9drdb"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.391396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.390092 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.391935 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.392686 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.394060 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.394082 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w4nkj"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.394162 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.394328 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-65n7c"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.394971 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.395128 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-65n7c"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.395838 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rnkj5"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.395859 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.395897 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.397528 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.398937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.400769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nxsch"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.402158 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.404645 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.410970 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7vbmz"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.411866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.413503 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kxd92"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.417238 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.417300 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.417441 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kxd92"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.418750 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.422891 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.423640 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.423678 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2dbbd"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.426314 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-497mk"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.427660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.429174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.431370 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v8227"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.432607 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jxcr"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.433620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.434650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.435921 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.437365 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.438393 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65n7c"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.439388 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9drdb"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.440376 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.441295 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.442223 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-psm7l"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.442423 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.443181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-psm7l"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.443387 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6hzt4"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.444361 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.445323 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-psm7l"]
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.461879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.482264 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.502992 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.522426 4749 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-ingress-operator"/"metrics-tls" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.547950 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.562186 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.582162 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.603287 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.622647 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.676586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bslmj\" (UniqueName: \"kubernetes.io/projected/c553f8a1-0943-4437-9cd4-ef9192718b1e-kube-api-access-bslmj\") pod \"console-operator-58897d9998-2wn9n\" (UID: \"c553f8a1-0943-4437-9cd4-ef9192718b1e\") " pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.696369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9blt\" (UniqueName: \"kubernetes.io/projected/0263ca67-c191-4560-be01-a2ad7fbeea4f-kube-api-access-m9blt\") pod \"authentication-operator-69f744f599-j6r8j\" (UID: \"0263ca67-c191-4560-be01-a2ad7fbeea4f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.714420 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2h49f\" (UniqueName: \"kubernetes.io/projected/9e1cff1c-e90e-42a5-8ca9-0335f93988ff-kube-api-access-2h49f\") pod \"machine-approver-56656f9798-4dfs6\" (UID: \"9e1cff1c-e90e-42a5-8ca9-0335f93988ff\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.735255 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2xz\" (UniqueName: \"kubernetes.io/projected/4586ae5a-18b0-474f-8581-a2cd850ea693-kube-api-access-6q2xz\") pod \"router-default-5444994796-slnmj\" (UID: \"4586ae5a-18b0-474f-8581-a2cd850ea693\") " pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.754238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm22m\" (UniqueName: \"kubernetes.io/projected/4c22c2e7-97be-4549-9d14-c65edcc0b2ec-kube-api-access-gm22m\") pod \"openshift-config-operator-7777fb866f-htrq5\" (UID: \"4c22c2e7-97be-4549-9d14-c65edcc0b2ec\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.774447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx8tj\" (UniqueName: \"kubernetes.io/projected/c30b3c06-11f3-450e-8793-4bacb3756a3e-kube-api-access-tx8tj\") pod \"apiserver-76f77b778f-497mk\" (UID: \"c30b3c06-11f3-450e-8793-4bacb3756a3e\") " pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.774603 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.795678 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.800489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sl26\" (UniqueName: \"kubernetes.io/projected/1d278b7c-89ac-4929-9820-911c03fdb680-kube-api-access-9sl26\") pod \"openshift-apiserver-operator-796bbdcf4f-tzw68\" (UID: \"1d278b7c-89ac-4929-9820-911c03fdb680\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.803291 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 18:34:19 crc kubenswrapper[4749]: W0219 18:34:19.809152 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e1cff1c_e90e_42a5_8ca9_0335f93988ff.slice/crio-ca361793f1692315552123e66cba85a2c58f8b56b5ae3ee9816fd66999320cad WatchSource:0}: Error finding container ca361793f1692315552123e66cba85a2c58f8b56b5ae3ee9816fd66999320cad: Status 404 returned error can't find the container with id ca361793f1692315552123e66cba85a2c58f8b56b5ae3ee9816fd66999320cad Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.822100 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.830274 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.842310 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.848646 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.863369 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.868975 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.889526 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.902642 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.940995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qw44\" (UniqueName: \"kubernetes.io/projected/10b818d4-d6fd-4377-bd69-510eda43e365-kube-api-access-4qw44\") pod \"cluster-samples-operator-665b6dd947-8cg8w\" (UID: \"10b818d4-d6fd-4377-bd69-510eda43e365\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.943447 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 
18:34:19.963123 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.964211 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" event={"ID":"9e1cff1c-e90e-42a5-8ca9-0335f93988ff","Type":"ContainerStarted","Data":"ca361793f1692315552123e66cba85a2c58f8b56b5ae3ee9816fd66999320cad"} Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.964542 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-497mk"] Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.973109 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" Feb 19 18:34:19 crc kubenswrapper[4749]: I0219 18:34:19.983783 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.003413 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.029390 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.034964 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.039697 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-htrq5"] Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.042981 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.048608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2wn9n"] Feb 19 18:34:20 crc kubenswrapper[4749]: W0219 18:34:20.054589 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4586ae5a_18b0_474f_8581_a2cd850ea693.slice/crio-25bd0e2c07b715c677aa4a5c10176e43ffe2679a7fa624ad362fe0640b3847d3 WatchSource:0}: Error finding container 25bd0e2c07b715c677aa4a5c10176e43ffe2679a7fa624ad362fe0640b3847d3: Status 404 returned error can't find the container with id 25bd0e2c07b715c677aa4a5c10176e43ffe2679a7fa624ad362fe0640b3847d3 Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.074190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.082847 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.091878 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j6r8j"] Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.102891 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 18:34:20 
crc kubenswrapper[4749]: I0219 18:34:20.118277 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.122279 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.142065 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.145704 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68"] Feb 19 18:34:20 crc kubenswrapper[4749]: W0219 18:34:20.153364 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d278b7c_89ac_4929_9820_911c03fdb680.slice/crio-be529f01876217401e462ecfbe3ac32a29051fe6aba707c3deac4d5712549d37 WatchSource:0}: Error finding container be529f01876217401e462ecfbe3ac32a29051fe6aba707c3deac4d5712549d37: Status 404 returned error can't find the container with id be529f01876217401e462ecfbe3ac32a29051fe6aba707c3deac4d5712549d37 Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.162170 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.183786 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.206128 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.222350 4749 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.243075 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.263755 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.282518 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.299712 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w"] Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.302886 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.322573 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.343105 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.361734 4749 request.go:700] Waited for 1.000817054s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.363905 4749 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.382745 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.408338 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.422538 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.442255 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.462581 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.483628 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.502672 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.522876 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.542951 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.562491 4749 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.583280 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.602864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.622598 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.642338 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.678799 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.679150 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.679231 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.679240 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.682604 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac1677b6-8344-44c7-a5fc-2924da30ddbc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4919053a-3b2b-4575-a86f-85004ba4b899-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-client-ca\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1cfc2e6-b878-4af9-969b-33a513042b75-serving-cert\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709357 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65ddl\" (UniqueName: \"kubernetes.io/projected/41d2c5ca-2f07-4fb0-9822-5d3f7119f56b-kube-api-access-65ddl\") pod \"downloads-7954f5f757-7vbmz\" (UID: \"41d2c5ca-2f07-4fb0-9822-5d3f7119f56b\") " pod="openshift-console/downloads-7954f5f757-7vbmz" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a11162-6554-4080-9b1f-e0864a79ec01-serving-cert\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce6dc220-87e7-485b-ac8d-16654ba01e3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-etcd-client\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9bea494d-956c-4fa1-b65e-7d791d79690c-audit-dir\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkk99\" (UniqueName: \"kubernetes.io/projected/4919053a-3b2b-4575-a86f-85004ba4b899-kube-api-access-bkk99\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg2k6\" (UniqueName: \"kubernetes.io/projected/9bea494d-956c-4fa1-b65e-7d791d79690c-kube-api-access-tg2k6\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-722xt\" (UniqueName: \"kubernetes.io/projected/864274b1-985f-447a-b8d3-c7d9b2c2751b-kube-api-access-722xt\") pod \"dns-operator-744455d44c-q7rpj\" (UID: \"864274b1-985f-447a-b8d3-c7d9b2c2751b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ced680c0-5c35-4d74-b553-7f95483907f7-serving-cert\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-service-ca\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709710 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-audit-policies\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709736 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-oauth-config\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6s22\" (UniqueName: \"kubernetes.io/projected/b1a11162-6554-4080-9b1f-e0864a79ec01-kube-api-access-z6s22\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdhhw\" (UniqueName: 
\"kubernetes.io/projected/a1cfc2e6-b878-4af9-969b-33a513042b75-kube-api-access-fdhhw\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709794 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-client\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709813 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-config\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-trusted-ca-bundle\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-certificates\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 
18:34:20.709882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-config\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-config\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-oauth-serving-cert\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-tls\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-serving-cert\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " 
pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.709977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/864274b1-985f-447a-b8d3-c7d9b2c2751b-metrics-tls\") pod \"dns-operator-744455d44c-q7rpj\" (UID: \"864274b1-985f-447a-b8d3-c7d9b2c2751b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clchw\" (UniqueName: \"kubernetes.io/projected/ced680c0-5c35-4d74-b553-7f95483907f7-kube-api-access-clchw\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkx6v\" (UniqueName: \"kubernetes.io/projected/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-kube-api-access-kkx6v\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-config\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac1677b6-8344-44c7-a5fc-2924da30ddbc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrh8\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-kube-api-access-gjrh8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-encryption-config\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4919053a-3b2b-4575-a86f-85004ba4b899-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdm6\" (UniqueName: \"kubernetes.io/projected/ce6dc220-87e7-485b-ac8d-16654ba01e3a-kube-api-access-6qdm6\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-ca\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710503 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-service-ca\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc385e2a-5c57-49bc-a308-57a35663a452-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710723 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-bound-sa-token\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710862 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4919053a-3b2b-4575-a86f-85004ba4b899-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fc385e2a-5c57-49bc-a308-57a35663a452-images\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.710996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc385e2a-5c57-49bc-a308-57a35663a452-config\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:20 crc kubenswrapper[4749]: E0219 18:34:20.711226 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.211199432 +0000 UTC m=+35.172419506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711224 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-trusted-ca\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-client-ca\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ce6dc220-87e7-485b-ac8d-16654ba01e3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711416 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-serving-cert\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711527 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hdr\" (UniqueName: \"kubernetes.io/projected/fc385e2a-5c57-49bc-a308-57a35663a452-kube-api-access-f4hdr\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.711856 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.722940 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.742805 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.763162 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.782823 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.803755 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.812870 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:20 crc kubenswrapper[4749]: E0219 18:34:20.813044 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.313001285 +0000 UTC m=+35.274221239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-client-ca\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a11162-6554-4080-9b1f-e0864a79ec01-serving-cert\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce6dc220-87e7-485b-ac8d-16654ba01e3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/9bea494d-956c-4fa1-b65e-7d791d79690c-audit-dir\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99352d0d-2b0b-4aa0-b6a0-41e16070a323-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-service-ca\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25539c1d-ee15-42e1-8743-3dbed89feb4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26515dd8-3e9b-446d-bdda-547bd85ea373-config\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzf62\" (UniqueName: \"kubernetes.io/projected/e9cd83f3-4f63-4348-b467-b23ba946b7bc-kube-api-access-bzf62\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b708b2de-ec1b-4cb9-8113-ab44e5437a9c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jxcr\" (UID: \"b708b2de-ec1b-4cb9-8113-ab44e5437a9c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-audit-policies\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2tv\" (UniqueName: \"kubernetes.io/projected/eac77a6f-f6c7-404c-9756-c33ff95f0c65-kube-api-access-jh2tv\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd3a8256-8304-48a6-bcab-be5fee9f8017-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e534c2-c769-4ad3-942b-d181ed2cf11e-config-volume\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6s22\" (UniqueName: \"kubernetes.io/projected/b1a11162-6554-4080-9b1f-e0864a79ec01-kube-api-access-z6s22\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz5hv\" (UniqueName: \"kubernetes.io/projected/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-kube-api-access-pz5hv\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbdql\" (UniqueName: \"kubernetes.io/projected/5c6fd864-39d6-4de5-9f6c-ec95242b9178-kube-api-access-sbdql\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdhhw\" (UniqueName: \"kubernetes.io/projected/a1cfc2e6-b878-4af9-969b-33a513042b75-kube-api-access-fdhhw\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813493 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e534c2-c769-4ad3-942b-d181ed2cf11e-secret-volume\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-client\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-trusted-ca-bundle\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-config\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc 
kubenswrapper[4749]: I0219 18:34:20.813588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-config\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99352d0d-2b0b-4aa0-b6a0-41e16070a323-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-socket-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-serving-cert\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5c6fd864-39d6-4de5-9f6c-ec95242b9178-tmpfs\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clchw\" (UniqueName: \"kubernetes.io/projected/ced680c0-5c35-4d74-b553-7f95483907f7-kube-api-access-clchw\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/406ec180-fbdf-4a41-84f9-749915f3eaa2-config\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-registration-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813808 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/176bf56a-e65f-42f1-a975-298674f1f6a6-metrics-tls\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6djc\" (UniqueName: \"kubernetes.io/projected/4f215af7-dad4-4dd1-9cc7-20c611eacace-kube-api-access-f6djc\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813843 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0db2db75-ec63-4052-be96-c5c34976fa18-signing-cabundle\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5c6fd864-39d6-4de5-9f6c-ec95242b9178-apiservice-cert\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 
18:34:20.813874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5n9w\" (UniqueName: \"kubernetes.io/projected/4882b1fe-f663-4de2-9d42-ae55ca424efe-kube-api-access-n5n9w\") pod \"migrator-59844c95c7-8bk42\" (UID: \"4882b1fe-f663-4de2-9d42-ae55ca424efe\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4919053a-3b2b-4575-a86f-85004ba4b899-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wptx\" (UniqueName: \"kubernetes.io/projected/18ce2742-770a-492b-a2c1-b1c615b27c71-kube-api-access-7wptx\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl4bn\" (UID: \"18ce2742-770a-492b-a2c1-b1c615b27c71\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.813994 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/fc385e2a-5c57-49bc-a308-57a35663a452-images\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.814018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-trusted-ca\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.814081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/176bf56a-e65f-42f1-a975-298674f1f6a6-trusted-ca\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.814099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfr7\" (UniqueName: \"kubernetes.io/projected/d8df7789-2cce-4ea5-bd59-996174038a1f-kube-api-access-ltfr7\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.814114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26515dd8-3e9b-446d-bdda-547bd85ea373-serving-cert\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" Feb 19 18:34:20 crc 
kubenswrapper[4749]: I0219 18:34:20.814133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce6dc220-87e7-485b-ac8d-16654ba01e3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3a8256-8304-48a6-bcab-be5fee9f8017-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ff2\" (UniqueName: \"kubernetes.io/projected/92f8c721-43b3-4211-9580-9a1bbd4b61e9-kube-api-access-64ff2\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3a8256-8304-48a6-bcab-be5fee9f8017-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/18ce2742-770a-492b-a2c1-b1c615b27c71-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl4bn\" (UID: \"18ce2742-770a-492b-a2c1-b1c615b27c71\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815621 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hdr\" (UniqueName: \"kubernetes.io/projected/fc385e2a-5c57-49bc-a308-57a35663a452-kube-api-access-f4hdr\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e9cd83f3-4f63-4348-b467-b23ba946b7bc-certs\") pod \"machine-config-server-kxd92\" (UID: 
\"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5c6fd864-39d6-4de5-9f6c-ec95242b9178-webhook-cert\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjl5\" (UniqueName: \"kubernetes.io/projected/ad0ad2f0-2678-4896-9f38-081e37050f36-kube-api-access-dxjl5\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-plugins-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.815932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-srv-cert\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7rgwv\" (UniqueName: \"kubernetes.io/projected/bb59f1a0-666b-42b1-b0af-6f7bdeb5a895-kube-api-access-7rgwv\") pod \"package-server-manager-789f6589d5-7pgnn\" (UID: \"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8df7789-2cce-4ea5-bd59-996174038a1f-proxy-tls\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/406ec180-fbdf-4a41-84f9-749915f3eaa2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92f8c721-43b3-4211-9580-9a1bbd4b61e9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqfw2\" (UniqueName: \"kubernetes.io/projected/99352d0d-2b0b-4aa0-b6a0-41e16070a323-kube-api-access-lqfw2\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-policies\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1cfc2e6-b878-4af9-969b-33a513042b75-serving-cert\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.816331 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65ddl\" (UniqueName: \"kubernetes.io/projected/41d2c5ca-2f07-4fb0-9822-5d3f7119f56b-kube-api-access-65ddl\") pod \"downloads-7954f5f757-7vbmz\" (UID: \"41d2c5ca-2f07-4fb0-9822-5d3f7119f56b\") " pod="openshift-console/downloads-7954f5f757-7vbmz" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.817160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-service-ca\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.817302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-config\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.817909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-config\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.817991 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-audit-policies\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.818489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-client-ca\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.821174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9bea494d-956c-4fa1-b65e-7d791d79690c-audit-dir\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.822144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.822159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-serving-cert\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.822984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a11162-6554-4080-9b1f-e0864a79ec01-serving-cert\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.823335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-trusted-ca-bundle\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.823389 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.823939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce6dc220-87e7-485b-ac8d-16654ba01e3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.825744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fc385e2a-5c57-49bc-a308-57a35663a452-images\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.827191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1cfc2e6-b878-4af9-969b-33a513042b75-serving-cert\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.827704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.827776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.827941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/25539c1d-ee15-42e1-8743-3dbed89feb4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-etcd-client\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkk99\" (UniqueName: \"kubernetes.io/projected/4919053a-3b2b-4575-a86f-85004ba4b899-kube-api-access-bkk99\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg2k6\" (UniqueName: \"kubernetes.io/projected/9bea494d-956c-4fa1-b65e-7d791d79690c-kube-api-access-tg2k6\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-client\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828222 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-722xt\" (UniqueName: \"kubernetes.io/projected/864274b1-985f-447a-b8d3-c7d9b2c2751b-kube-api-access-722xt\") pod \"dns-operator-744455d44c-q7rpj\" (UID: \"864274b1-985f-447a-b8d3-c7d9b2c2751b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0db2db75-ec63-4052-be96-c5c34976fa18-signing-key\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-trusted-ca\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ced680c0-5c35-4d74-b553-7f95483907f7-serving-cert\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.828939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f215af7-dad4-4dd1-9cc7-20c611eacace-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:20 crc 
kubenswrapper[4749]: I0219 18:34:20.829009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4f215af7-dad4-4dd1-9cc7-20c611eacace-ready\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.829070 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-oauth-config\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.829096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsbn8\" (UniqueName: \"kubernetes.io/projected/b708b2de-ec1b-4cb9-8113-ab44e5437a9c-kube-api-access-qsbn8\") pod \"multus-admission-controller-857f4d67dd-2jxcr\" (UID: \"b708b2de-ec1b-4cb9-8113-ab44e5437a9c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.829512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbdpm\" (UniqueName: \"kubernetes.io/projected/26515dd8-3e9b-446d-bdda-547bd85ea373-kube-api-access-pbdpm\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.829954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92f8c721-43b3-4211-9580-9a1bbd4b61e9-images\") pod 
\"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-config\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad0ad2f0-2678-4896-9f38-081e37050f36-srv-cert\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-certificates\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830615 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f215af7-dad4-4dd1-9cc7-20c611eacace-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmzb\" (UniqueName: \"kubernetes.io/projected/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-kube-api-access-klmzb\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-tls\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-oauth-serving-cert\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.830923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/864274b1-985f-447a-b8d3-c7d9b2c2751b-metrics-tls\") pod \"dns-operator-744455d44c-q7rpj\" (UID: \"864274b1-985f-447a-b8d3-c7d9b2c2751b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" Feb 19 18:34:20 crc 
kubenswrapper[4749]: I0219 18:34:20.830983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac77a6f-f6c7-404c-9756-c33ff95f0c65-config-volume\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.831062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.831431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkx6v\" (UniqueName: \"kubernetes.io/projected/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-kube-api-access-kkx6v\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.831554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfkx\" (UniqueName: \"kubernetes.io/projected/0db2db75-ec63-4052-be96-c5c34976fa18-kube-api-access-fwfkx\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.831765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p67zx\" (UniqueName: \"kubernetes.io/projected/f81702e7-e998-42de-926c-8d704940eab8-kube-api-access-p67zx\") pod \"ingress-canary-psm7l\" (UID: \"f81702e7-e998-42de-926c-8d704940eab8\") " pod="openshift-ingress-canary/ingress-canary-psm7l"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.831954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-certificates\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-config\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/176bf56a-e65f-42f1-a975-298674f1f6a6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-dir\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac1677b6-8344-44c7-a5fc-2924da30ddbc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-config\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8df7789-2cce-4ea5-bd59-996174038a1f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832696 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-profile-collector-cert\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvj9b\" (UniqueName: \"kubernetes.io/projected/c2e534c2-c769-4ad3-942b-d181ed2cf11e-kube-api-access-jvj9b\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.832923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrh8\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-kube-api-access-gjrh8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-encryption-config\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833345 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qdm6\" (UniqueName: \"kubernetes.io/projected/ce6dc220-87e7-485b-ac8d-16654ba01e3a-kube-api-access-6qdm6\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eac77a6f-f6c7-404c-9756-c33ff95f0c65-metrics-tls\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833552 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-mountpoint-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833604 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25z5z\" (UniqueName: \"kubernetes.io/projected/5bccd7a7-c28a-4612-914c-3b8f99324dec-kube-api-access-25z5z\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-ca\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-service-ca\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc385e2a-5c57-49bc-a308-57a35663a452-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833849 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-csi-data-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-oauth-serving-cert\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.833915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-bound-sa-token\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4919053a-3b2b-4575-a86f-85004ba4b899-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4919053a-3b2b-4575-a86f-85004ba4b899-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834332 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e9cd83f3-4f63-4348-b467-b23ba946b7bc-node-bootstrap-token\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-client-ca\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834512 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc385e2a-5c57-49bc-a308-57a35663a452-config\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4r9k\" (UniqueName: \"kubernetes.io/projected/176bf56a-e65f-42f1-a975-298674f1f6a6-kube-api-access-l4r9k\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad0ad2f0-2678-4896-9f38-081e37050f36-profile-collector-cert\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-config\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-service-ca\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834868 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25539c1d-ee15-42e1-8743-3dbed89feb4e-config\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.834943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92f8c721-43b3-4211-9580-9a1bbd4b61e9-proxy-tls\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"
Feb 19 18:34:20 crc kubenswrapper[4749]: E0219 18:34:20.834983 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.334961205 +0000 UTC m=+35.296181169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.835063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f81702e7-e998-42de-926c-8d704940eab8-cert\") pod \"ingress-canary-psm7l\" (UID: \"f81702e7-e998-42de-926c-8d704940eab8\") " pod="openshift-ingress-canary/ingress-canary-psm7l"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.835111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-serving-cert\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.835098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ced680c0-5c35-4d74-b553-7f95483907f7-etcd-ca\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.835146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf4qf\" (UniqueName: \"kubernetes.io/projected/be345565-0341-4290-b5e8-9cf728685a6b-kube-api-access-hf4qf\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.835449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc385e2a-5c57-49bc-a308-57a35663a452-config\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.835924 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.835966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/406ec180-fbdf-4a41-84f9-749915f3eaa2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.836065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac1677b6-8344-44c7-a5fc-2924da30ddbc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.836168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb59f1a0-666b-42b1-b0af-6f7bdeb5a895-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7pgnn\" (UID: \"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.836260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4919053a-3b2b-4575-a86f-85004ba4b899-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.836782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-etcd-client\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.837360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc385e2a-5c57-49bc-a308-57a35663a452-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.837692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9bea494d-956c-4fa1-b65e-7d791d79690c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.837705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac1677b6-8344-44c7-a5fc-2924da30ddbc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.837875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac1677b6-8344-44c7-a5fc-2924da30ddbc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.839774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ced680c0-5c35-4d74-b553-7f95483907f7-serving-cert\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.839817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce6dc220-87e7-485b-ac8d-16654ba01e3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.839859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-client-ca\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.840150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-encryption-config\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.840713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-tls\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.841155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bea494d-956c-4fa1-b65e-7d791d79690c-serving-cert\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.841653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4919053a-3b2b-4575-a86f-85004ba4b899-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.842511 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.842855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/864274b1-985f-447a-b8d3-c7d9b2c2751b-metrics-tls\") pod \"dns-operator-744455d44c-q7rpj\" (UID: \"864274b1-985f-447a-b8d3-c7d9b2c2751b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.844214 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-oauth-config\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.863406 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.883621 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.903281 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.923499 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.937676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:20 crc kubenswrapper[4749]: E0219 18:34:20.937830 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.437796252 +0000 UTC m=+35.399016216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.937872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb59f1a0-666b-42b1-b0af-6f7bdeb5a895-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7pgnn\" (UID: \"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.937906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99352d0d-2b0b-4aa0-b6a0-41e16070a323-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.937935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.937955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26515dd8-3e9b-446d-bdda-547bd85ea373-config\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.937982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25539c1d-ee15-42e1-8743-3dbed89feb4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938045 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzf62\" (UniqueName: \"kubernetes.io/projected/e9cd83f3-4f63-4348-b467-b23ba946b7bc-kube-api-access-bzf62\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b708b2de-ec1b-4cb9-8113-ab44e5437a9c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jxcr\" (UID: \"b708b2de-ec1b-4cb9-8113-ab44e5437a9c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2tv\" (UniqueName: \"kubernetes.io/projected/eac77a6f-f6c7-404c-9756-c33ff95f0c65-kube-api-access-jh2tv\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd3a8256-8304-48a6-bcab-be5fee9f8017-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e534c2-c769-4ad3-942b-d181ed2cf11e-config-volume\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz5hv\" (UniqueName: \"kubernetes.io/projected/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-kube-api-access-pz5hv\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938176 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbdql\" (UniqueName: \"kubernetes.io/projected/5c6fd864-39d6-4de5-9f6c-ec95242b9178-kube-api-access-sbdql\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e534c2-c769-4ad3-942b-d181ed2cf11e-secret-volume\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99352d0d-2b0b-4aa0-b6a0-41e16070a323-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-socket-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5c6fd864-39d6-4de5-9f6c-ec95242b9178-tmpfs\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938314 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/406ec180-fbdf-4a41-84f9-749915f3eaa2-config\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-registration-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5c6fd864-39d6-4de5-9f6c-ec95242b9178-apiservice-cert\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/176bf56a-e65f-42f1-a975-298674f1f6a6-metrics-tls\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6djc\" (UniqueName: \"kubernetes.io/projected/4f215af7-dad4-4dd1-9cc7-20c611eacace-kube-api-access-f6djc\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0db2db75-ec63-4052-be96-c5c34976fa18-signing-cabundle\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") "
pod="openshift-service-ca/service-ca-9c57cc56f-v8227" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5n9w\" (UniqueName: \"kubernetes.io/projected/4882b1fe-f663-4de2-9d42-ae55ca424efe-kube-api-access-n5n9w\") pod \"migrator-59844c95c7-8bk42\" (UID: \"4882b1fe-f663-4de2-9d42-ae55ca424efe\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wptx\" (UniqueName: \"kubernetes.io/projected/18ce2742-770a-492b-a2c1-b1c615b27c71-kube-api-access-7wptx\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl4bn\" (UID: \"18ce2742-770a-492b-a2c1-b1c615b27c71\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938496 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26515dd8-3e9b-446d-bdda-547bd85ea373-serving-cert\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/176bf56a-e65f-42f1-a975-298674f1f6a6-trusted-ca\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltfr7\" (UniqueName: \"kubernetes.io/projected/d8df7789-2cce-4ea5-bd59-996174038a1f-kube-api-access-ltfr7\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938563 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3a8256-8304-48a6-bcab-be5fee9f8017-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64ff2\" (UniqueName: \"kubernetes.io/projected/92f8c721-43b3-4211-9580-9a1bbd4b61e9-kube-api-access-64ff2\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3a8256-8304-48a6-bcab-be5fee9f8017-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/18ce2742-770a-492b-a2c1-b1c615b27c71-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl4bn\" (UID: \"18ce2742-770a-492b-a2c1-b1c615b27c71\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e9cd83f3-4f63-4348-b467-b23ba946b7bc-certs\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5c6fd864-39d6-4de5-9f6c-ec95242b9178-webhook-cert\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dxjl5\" (UniqueName: \"kubernetes.io/projected/ad0ad2f0-2678-4896-9f38-081e37050f36-kube-api-access-dxjl5\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-plugins-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-socket-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-srv-cert\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rgwv\" (UniqueName: \"kubernetes.io/projected/bb59f1a0-666b-42b1-b0af-6f7bdeb5a895-kube-api-access-7rgwv\") pod \"package-server-manager-789f6589d5-7pgnn\" (UID: \"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938899 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8df7789-2cce-4ea5-bd59-996174038a1f-proxy-tls\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/406ec180-fbdf-4a41-84f9-749915f3eaa2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92f8c721-43b3-4211-9580-9a1bbd4b61e9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqfw2\" (UniqueName: \"kubernetes.io/projected/99352d0d-2b0b-4aa0-b6a0-41e16070a323-kube-api-access-lqfw2\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.938976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-policies\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25539c1d-ee15-42e1-8743-3dbed89feb4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939070 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0db2db75-ec63-4052-be96-c5c34976fa18-signing-key\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f215af7-dad4-4dd1-9cc7-20c611eacace-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939131 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4f215af7-dad4-4dd1-9cc7-20c611eacace-ready\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsbn8\" (UniqueName: \"kubernetes.io/projected/b708b2de-ec1b-4cb9-8113-ab44e5437a9c-kube-api-access-qsbn8\") pod \"multus-admission-controller-857f4d67dd-2jxcr\" (UID: \"b708b2de-ec1b-4cb9-8113-ab44e5437a9c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdpm\" (UniqueName: \"kubernetes.io/projected/26515dd8-3e9b-446d-bdda-547bd85ea373-kube-api-access-pbdpm\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92f8c721-43b3-4211-9580-9a1bbd4b61e9-images\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad0ad2f0-2678-4896-9f38-081e37050f36-srv-cert\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f215af7-dad4-4dd1-9cc7-20c611eacace-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klmzb\" (UniqueName: \"kubernetes.io/projected/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-kube-api-access-klmzb\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939283 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac77a6f-f6c7-404c-9756-c33ff95f0c65-config-volume\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: 
\"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfkx\" (UniqueName: \"kubernetes.io/projected/0db2db75-ec63-4052-be96-c5c34976fa18-kube-api-access-fwfkx\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939342 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p67zx\" (UniqueName: \"kubernetes.io/projected/f81702e7-e998-42de-926c-8d704940eab8-kube-api-access-p67zx\") pod \"ingress-canary-psm7l\" (UID: \"f81702e7-e998-42de-926c-8d704940eab8\") " pod="openshift-ingress-canary/ingress-canary-psm7l" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/176bf56a-e65f-42f1-a975-298674f1f6a6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939376 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-dir\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d8df7789-2cce-4ea5-bd59-996174038a1f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-profile-collector-cert\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvj9b\" (UniqueName: \"kubernetes.io/projected/c2e534c2-c769-4ad3-942b-d181ed2cf11e-kube-api-access-jvj9b\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939471 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eac77a6f-f6c7-404c-9756-c33ff95f0c65-metrics-tls\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 
18:34:20.939489 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-mountpoint-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25z5z\" (UniqueName: \"kubernetes.io/projected/5bccd7a7-c28a-4612-914c-3b8f99324dec-kube-api-access-25z5z\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-csi-data-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e9cd83f3-4f63-4348-b467-b23ba946b7bc-node-bootstrap-token\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25539c1d-ee15-42e1-8743-3dbed89feb4e-config\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4r9k\" (UniqueName: \"kubernetes.io/projected/176bf56a-e65f-42f1-a975-298674f1f6a6-kube-api-access-l4r9k\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/ad0ad2f0-2678-4896-9f38-081e37050f36-profile-collector-cert\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92f8c721-43b3-4211-9580-9a1bbd4b61e9-proxy-tls\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f81702e7-e998-42de-926c-8d704940eab8-cert\") pod \"ingress-canary-psm7l\" (UID: \"f81702e7-e998-42de-926c-8d704940eab8\") " pod="openshift-ingress-canary/ingress-canary-psm7l" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf4qf\" (UniqueName: \"kubernetes.io/projected/be345565-0341-4290-b5e8-9cf728685a6b-kube-api-access-hf4qf\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.939744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/406ec180-fbdf-4a41-84f9-749915f3eaa2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.940694 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-mountpoint-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.940932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-csi-data-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.941323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99352d0d-2b0b-4aa0-b6a0-41e16070a323-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.941434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92f8c721-43b3-4211-9580-9a1bbd4b61e9-images\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.941848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.942305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/406ec180-fbdf-4a41-84f9-749915f3eaa2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.942587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-srv-cert\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"
Feb 19 18:34:20 crc kubenswrapper[4749]: E0219 18:34:20.942789 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.442759996 +0000 UTC m=+35.403980030 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.943315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb59f1a0-666b-42b1-b0af-6f7bdeb5a895-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7pgnn\" (UID: \"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.943885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-dir\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.944646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.945093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8df7789-2cce-4ea5-bd59-996174038a1f-proxy-tls\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.945680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8df7789-2cce-4ea5-bd59-996174038a1f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.945761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/176bf56a-e65f-42f1-a975-298674f1f6a6-trusted-ca\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.946745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3a8256-8304-48a6-bcab-be5fee9f8017-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.947363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25539c1d-ee15-42e1-8743-3dbed89feb4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.948008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.948061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b708b2de-ec1b-4cb9-8113-ab44e5437a9c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jxcr\" (UID: \"b708b2de-ec1b-4cb9-8113-ab44e5437a9c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.944821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.948152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92f8c721-43b3-4211-9580-9a1bbd4b61e9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.948367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0db2db75-ec63-4052-be96-c5c34976fa18-signing-cabundle\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.948386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99352d0d-2b0b-4aa0-b6a0-41e16070a323-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.948548 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.949012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.949397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4f215af7-dad4-4dd1-9cc7-20c611eacace-ready\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.950401 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.950846 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad0ad2f0-2678-4896-9f38-081e37050f36-srv-cert\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.950960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f215af7-dad4-4dd1-9cc7-20c611eacace-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.951106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0db2db75-ec63-4052-be96-c5c34976fa18-signing-key\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.951235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-profile-collector-cert\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.951772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.952059 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25539c1d-ee15-42e1-8743-3dbed89feb4e-config\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.952085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.952451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-policies\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.952503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e534c2-c769-4ad3-942b-d181ed2cf11e-secret-volume\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.952595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/406ec180-fbdf-4a41-84f9-749915f3eaa2-config\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.952655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-plugins-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.952702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5bccd7a7-c28a-4612-914c-3b8f99324dec-registration-dir\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.953795 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.953935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.954144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.954154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5c6fd864-39d6-4de5-9f6c-ec95242b9178-tmpfs\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.954523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/176bf56a-e65f-42f1-a975-298674f1f6a6-metrics-tls\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.954712 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92f8c721-43b3-4211-9580-9a1bbd4b61e9-proxy-tls\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.954741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3a8256-8304-48a6-bcab-be5fee9f8017-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.955625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad0ad2f0-2678-4896-9f38-081e37050f36-profile-collector-cert\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.955742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.955751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5c6fd864-39d6-4de5-9f6c-ec95242b9178-apiservice-cert\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.956247 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.956957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5c6fd864-39d6-4de5-9f6c-ec95242b9178-webhook-cert\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.959118 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26515dd8-3e9b-446d-bdda-547bd85ea373-serving-cert\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.962464 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.976050 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-slnmj" event={"ID":"4586ae5a-18b0-474f-8581-a2cd850ea693","Type":"ContainerStarted","Data":"89b2e4a50d9b53e5f698379eab75a7c19dfd68a2117271fb8721fb3e7c3fc9ff"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.976112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-slnmj" event={"ID":"4586ae5a-18b0-474f-8581-a2cd850ea693","Type":"ContainerStarted","Data":"25bd0e2c07b715c677aa4a5c10176e43ffe2679a7fa624ad362fe0640b3847d3"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.978793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" event={"ID":"0263ca67-c191-4560-be01-a2ad7fbeea4f","Type":"ContainerStarted","Data":"40a367cc229ede894a3af1669d12326977bffb61c7471f35c57cc66562cf4904"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.978863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" event={"ID":"0263ca67-c191-4560-be01-a2ad7fbeea4f","Type":"ContainerStarted","Data":"2520c41a855907d5786b34721b0cedc960d2cb0a756da6eafa076431a2516196"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.981014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" event={"ID":"9e1cff1c-e90e-42a5-8ca9-0335f93988ff","Type":"ContainerStarted","Data":"3debed165b161e4038082a8d1d871a6524e5f835faab4c08f07864d4a9254823"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.981058 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" event={"ID":"9e1cff1c-e90e-42a5-8ca9-0335f93988ff","Type":"ContainerStarted","Data":"38d2bf48a6e9cc05df0398249ff127faab3946ce35c66bdb9f2832ddee14858d"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.982572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2wn9n" event={"ID":"c553f8a1-0943-4437-9cd4-ef9192718b1e","Type":"ContainerStarted","Data":"c793b777cb9d4092b12c3a9aa6c856ac709ecd7c0d82232d542df126c2dc4a59"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.982606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2wn9n" event={"ID":"c553f8a1-0943-4437-9cd4-ef9192718b1e","Type":"ContainerStarted","Data":"4cf330480a8a261ca7b18b679a33e5dc7a4939865d3c8f7bfae9e2cc4f73f86d"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.982775 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2wn9n"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.983649 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.984187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" event={"ID":"1d278b7c-89ac-4929-9820-911c03fdb680","Type":"ContainerStarted","Data":"e9d3f8c11d622a971e555770f26ce03b46a5ce248b577b422e082eb74a80f26e"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.984229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" event={"ID":"1d278b7c-89ac-4929-9820-911c03fdb680","Type":"ContainerStarted","Data":"be529f01876217401e462ecfbe3ac32a29051fe6aba707c3deac4d5712549d37"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.984429 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-2wn9n container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.984501 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2wn9n" podUID="c553f8a1-0943-4437-9cd4-ef9192718b1e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.985939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" event={"ID":"10b818d4-d6fd-4377-bd69-510eda43e365","Type":"ContainerStarted","Data":"df0088bcd86708f67a2b4c3c58b8c58ee73bef558d2acc37b69cf14841f1cd10"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.985966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" event={"ID":"10b818d4-d6fd-4377-bd69-510eda43e365","Type":"ContainerStarted","Data":"e968981582f10309a435e300c87bec25fd38533187a2dbae78573ea5b59ff21e"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.985980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" event={"ID":"10b818d4-d6fd-4377-bd69-510eda43e365","Type":"ContainerStarted","Data":"8a9277b79389bb55da7aeb68df357ec76f80fc91d153b416741650ba6e27474d"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.987988 4749 generic.go:334] "Generic (PLEG): container finished" podID="c30b3c06-11f3-450e-8793-4bacb3756a3e" containerID="1941215d28bba97445be6ba679fdb8e89b7ae3e3428af7510b1719fc70743e37" exitCode=0
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.988064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-497mk" event={"ID":"c30b3c06-11f3-450e-8793-4bacb3756a3e","Type":"ContainerDied","Data":"1941215d28bba97445be6ba679fdb8e89b7ae3e3428af7510b1719fc70743e37"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.988083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-497mk" event={"ID":"c30b3c06-11f3-450e-8793-4bacb3756a3e","Type":"ContainerStarted","Data":"b564a4bbd942cf2187c0b5bcde2bc79ab61f75d7d1f2f7565df18738dc22fc6c"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.991486 4749 generic.go:334] "Generic (PLEG): container finished" podID="4c22c2e7-97be-4549-9d14-c65edcc0b2ec" containerID="8306e3c68fb5c15c5475ff7c8e944a8771924a967a93cae103565613c5a5718c" exitCode=0
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.991530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" event={"ID":"4c22c2e7-97be-4549-9d14-c65edcc0b2ec","Type":"ContainerDied","Data":"8306e3c68fb5c15c5475ff7c8e944a8771924a967a93cae103565613c5a5718c"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.991559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" event={"ID":"4c22c2e7-97be-4549-9d14-c65edcc0b2ec","Type":"ContainerStarted","Data":"8fc9d34c5fbe5ba53d024b0f9800781d16c99336d3e06473d248d49c5255529b"}
Feb 19 18:34:20 crc kubenswrapper[4749]: I0219 18:34:20.992225 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26515dd8-3e9b-446d-bdda-547bd85ea373-config\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.002592 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.009215 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e534c2-c769-4ad3-942b-d181ed2cf11e-config-volume\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.027145 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.035828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-slnmj"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.040206 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.040323 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 18:34:21 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld
Feb 19 18:34:21 crc kubenswrapper[4749]: [+]process-running ok
Feb 19 18:34:21 crc kubenswrapper[4749]: healthz check failed
Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.040406 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.540374545 +0000 UTC m=+35.501594499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.040404 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.040927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.041499 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.541488961 +0000 UTC m=+35.502708915 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.043002 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.044338 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.051719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f215af7-dad4-4dd1-9cc7-20c611eacace-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.062768 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.083239 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.102116 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.122926 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.141672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.142773 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.642740162 +0000 UTC m=+35.603960156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.143670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.156343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eac77a6f-f6c7-404c-9756-c33ff95f0c65-metrics-tls\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.165168 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.174794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/18ce2742-770a-492b-a2c1-b1c615b27c71-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl4bn\" (UID: \"18ce2742-770a-492b-a2c1-b1c615b27c71\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.182855 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.205019 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.210250 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eac77a6f-f6c7-404c-9756-c33ff95f0c65-config-volume\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.223561 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.235232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e9cd83f3-4f63-4348-b467-b23ba946b7bc-node-bootstrap-token\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.243154 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.244313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.244972 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed.
No retries permitted until 2026-02-19 18:34:21.744943916 +0000 UTC m=+35.706163950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.263777 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.273497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e9cd83f3-4f63-4348-b467-b23ba946b7bc-certs\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.282108 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.295124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f81702e7-e998-42de-926c-8d704940eab8-cert\") pod \"ingress-canary-psm7l\" (UID: \"f81702e7-e998-42de-926c-8d704940eab8\") " pod="openshift-ingress-canary/ingress-canary-psm7l" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.302988 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.322199 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.328449 4749 csr.go:261] certificate signing request csr-cjzhr is approved, waiting to be issued Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.336016 4749 csr.go:257] certificate signing request csr-cjzhr is issued Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.343141 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.345741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.345864 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.84584955 +0000 UTC m=+35.807069504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.346141 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.346399 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.846392402 +0000 UTC m=+35.807612356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.384837 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.402598 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.422617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.442557 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.446904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.447112 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.947086071 +0000 UTC m=+35.908306025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.447381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.447785 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:21.947767177 +0000 UTC m=+35.908987141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.462805 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.482525 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.518097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6s22\" (UniqueName: \"kubernetes.io/projected/b1a11162-6554-4080-9b1f-e0864a79ec01-kube-api-access-z6s22\") pod \"route-controller-manager-6576b87f9c-87stm\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.544411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdhhw\" (UniqueName: \"kubernetes.io/projected/a1cfc2e6-b878-4af9-969b-33a513042b75-kube-api-access-fdhhw\") pod \"controller-manager-879f6c89f-d49ll\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.548759 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.548817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.549231 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.049207383 +0000 UTC m=+36.010427327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.572760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clchw\" (UniqueName: \"kubernetes.io/projected/ced680c0-5c35-4d74-b553-7f95483907f7-kube-api-access-clchw\") pod \"etcd-operator-b45778765-nxsch\" (UID: \"ced680c0-5c35-4d74-b553-7f95483907f7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.581321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65ddl\" (UniqueName: 
\"kubernetes.io/projected/41d2c5ca-2f07-4fb0-9822-5d3f7119f56b-kube-api-access-65ddl\") pod \"downloads-7954f5f757-7vbmz\" (UID: \"41d2c5ca-2f07-4fb0-9822-5d3f7119f56b\") " pod="openshift-console/downloads-7954f5f757-7vbmz" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.604398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hdr\" (UniqueName: \"kubernetes.io/projected/fc385e2a-5c57-49bc-a308-57a35663a452-kube-api-access-f4hdr\") pod \"machine-api-operator-5694c8668f-qbnpr\" (UID: \"fc385e2a-5c57-49bc-a308-57a35663a452\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.624797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg2k6\" (UniqueName: \"kubernetes.io/projected/9bea494d-956c-4fa1-b65e-7d791d79690c-kube-api-access-tg2k6\") pod \"apiserver-7bbb656c7d-dwgqx\" (UID: \"9bea494d-956c-4fa1-b65e-7d791d79690c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.651152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.651601 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.151585871 +0000 UTC m=+36.112805825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.660513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkk99\" (UniqueName: \"kubernetes.io/projected/4919053a-3b2b-4575-a86f-85004ba4b899-kube-api-access-bkk99\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.660534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-722xt\" (UniqueName: \"kubernetes.io/projected/864274b1-985f-447a-b8d3-c7d9b2c2751b-kube-api-access-722xt\") pod \"dns-operator-744455d44c-q7rpj\" (UID: \"864274b1-985f-447a-b8d3-c7d9b2c2751b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.679505 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7vbmz" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.685979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkx6v\" (UniqueName: \"kubernetes.io/projected/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-kube-api-access-kkx6v\") pod \"console-f9d7485db-rnkj5\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") " pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.700083 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.706715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrh8\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-kube-api-access-gjrh8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.719606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qdm6\" (UniqueName: \"kubernetes.io/projected/ce6dc220-87e7-485b-ac8d-16654ba01e3a-kube-api-access-6qdm6\") pod \"openshift-controller-manager-operator-756b6f6bc6-9dlzr\" (UID: \"ce6dc220-87e7-485b-ac8d-16654ba01e3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.742817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-bound-sa-token\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 
crc kubenswrapper[4749]: I0219 18:34:21.752586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.752704 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.252680099 +0000 UTC m=+36.213900053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.752993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.753407 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 18:34:22.253399826 +0000 UTC m=+36.214619780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.764251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4919053a-3b2b-4575-a86f-85004ba4b899-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tsf9g\" (UID: \"4919053a-3b2b-4575-a86f-85004ba4b899\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.774608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25z5z\" (UniqueName: \"kubernetes.io/projected/5bccd7a7-c28a-4612-914c-3b8f99324dec-kube-api-access-25z5z\") pod \"csi-hostpathplugin-9drdb\" (UID: \"5bccd7a7-c28a-4612-914c-3b8f99324dec\") " pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.790553 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.797229 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.801485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25539c1d-ee15-42e1-8743-3dbed89feb4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f62zc\" (UID: \"25539c1d-ee15-42e1-8743-3dbed89feb4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.803853 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.810733 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.826547 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.826837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rgwv\" (UniqueName: \"kubernetes.io/projected/bb59f1a0-666b-42b1-b0af-6f7bdeb5a895-kube-api-access-7rgwv\") pod \"package-server-manager-789f6589d5-7pgnn\" (UID: \"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.828070 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.849212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfkx\" (UniqueName: \"kubernetes.io/projected/0db2db75-ec63-4052-be96-c5c34976fa18-kube-api-access-fwfkx\") pod \"service-ca-9c57cc56f-v8227\" (UID: \"0db2db75-ec63-4052-be96-c5c34976fa18\") " pod="openshift-service-ca/service-ca-9c57cc56f-v8227" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.849755 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.853715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.853892 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.35386445 +0000 UTC m=+36.315084404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.854761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.855193 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.35517717 +0000 UTC m=+36.316397114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.863970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmzb\" (UniqueName: \"kubernetes.io/projected/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-kube-api-access-klmzb\") pod \"oauth-openshift-558db77b4-2dbbd\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.864236 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.882129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p67zx\" (UniqueName: \"kubernetes.io/projected/f81702e7-e998-42de-926c-8d704940eab8-kube-api-access-p67zx\") pod \"ingress-canary-psm7l\" (UID: \"f81702e7-e998-42de-926c-8d704940eab8\") " pod="openshift-ingress-canary/ingress-canary-psm7l" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.893293 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.905523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/176bf56a-e65f-42f1-a975-298674f1f6a6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.927744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzf62\" (UniqueName: \"kubernetes.io/projected/e9cd83f3-4f63-4348-b467-b23ba946b7bc-kube-api-access-bzf62\") pod \"machine-config-server-kxd92\" (UID: \"e9cd83f3-4f63-4348-b467-b23ba946b7bc\") " pod="openshift-machine-config-operator/machine-config-server-kxd92" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.937934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltfr7\" (UniqueName: \"kubernetes.io/projected/d8df7789-2cce-4ea5-bd59-996174038a1f-kube-api-access-ltfr7\") pod \"machine-config-controller-84d6567774-nzbph\" (UID: \"d8df7789-2cce-4ea5-bd59-996174038a1f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.955462 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v8227" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.955654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.956339 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.456320119 +0000 UTC m=+36.417540073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.958064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:21 crc kubenswrapper[4749]: E0219 18:34:21.958608 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.458594991 +0000 UTC m=+36.419814935 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.960811 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.960968 4749 request.go:700] Waited for 1.014105815s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/serviceaccounts/machine-config-operator/token Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.973281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d49ll"] Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.983503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6djc\" (UniqueName: \"kubernetes.io/projected/4f215af7-dad4-4dd1-9cc7-20c611eacace-kube-api-access-f6djc\") pod \"cni-sysctl-allowlist-ds-8tc74\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.990341 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.995308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ff2\" (UniqueName: \"kubernetes.io/projected/92f8c721-43b3-4211-9580-9a1bbd4b61e9-kube-api-access-64ff2\") pod \"machine-config-operator-74547568cd-k89cm\" (UID: \"92f8c721-43b3-4211-9580-9a1bbd4b61e9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.998189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" event={"ID":"4c22c2e7-97be-4549-9d14-c65edcc0b2ec","Type":"ContainerStarted","Data":"195ab69e99fdfa20ea43f3e99699f428c8240d00871f3ed87089fe73e58167ae"} Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.998357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:21 crc kubenswrapper[4749]: I0219 18:34:21.998630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/406ec180-fbdf-4a41-84f9-749915f3eaa2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qjxjv\" (UID: \"406ec180-fbdf-4a41-84f9-749915f3eaa2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.008190 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9drdb" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.008757 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-497mk" event={"ID":"c30b3c06-11f3-450e-8793-4bacb3756a3e","Type":"ContainerStarted","Data":"39f8c215681f34c0276644c77be94495b913ac57ae85001bc75a20a716ceb3f4"} Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.008787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-497mk" event={"ID":"c30b3c06-11f3-450e-8793-4bacb3756a3e","Type":"ContainerStarted","Data":"cc03e0b8860b82219bad9367b5d598cdbf864001e1f462c8d73d96344b3e8e01"} Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.014675 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2wn9n" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.032492 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kxd92" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.036128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd3a8256-8304-48a6-bcab-be5fee9f8017-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q68nf\" (UID: \"fd3a8256-8304-48a6-bcab-be5fee9f8017\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.038056 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-psm7l" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.040942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2tv\" (UniqueName: \"kubernetes.io/projected/eac77a6f-f6c7-404c-9756-c33ff95f0c65-kube-api-access-jh2tv\") pod \"dns-default-65n7c\" (UID: \"eac77a6f-f6c7-404c-9756-c33ff95f0c65\") " pod="openshift-dns/dns-default-65n7c" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.059604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz5hv\" (UniqueName: \"kubernetes.io/projected/cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012-kube-api-access-pz5hv\") pod \"catalog-operator-68c6474976-7q4b5\" (UID: \"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.059773 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 18:34:22 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 19 18:34:22 crc kubenswrapper[4749]: [+]process-running ok Feb 19 18:34:22 crc kubenswrapper[4749]: healthz check failed Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.059803 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.061336 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.061743 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.561728375 +0000 UTC m=+36.522948319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.086273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbdql\" (UniqueName: \"kubernetes.io/projected/5c6fd864-39d6-4de5-9f6c-ec95242b9178-kube-api-access-sbdql\") pod \"packageserver-d55dfcdfc-xzpgj\" (UID: \"5c6fd864-39d6-4de5-9f6c-ec95242b9178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.114477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqfw2\" (UniqueName: \"kubernetes.io/projected/99352d0d-2b0b-4aa0-b6a0-41e16070a323-kube-api-access-lqfw2\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cgsk\" (UID: \"99352d0d-2b0b-4aa0-b6a0-41e16070a323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 
18:34:22.131957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5n9w\" (UniqueName: \"kubernetes.io/projected/4882b1fe-f663-4de2-9d42-ae55ca424efe-kube-api-access-n5n9w\") pod \"migrator-59844c95c7-8bk42\" (UID: \"4882b1fe-f663-4de2-9d42-ae55ca424efe\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.165376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wptx\" (UniqueName: \"kubernetes.io/projected/18ce2742-770a-492b-a2c1-b1c615b27c71-kube-api-access-7wptx\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl4bn\" (UID: \"18ce2742-770a-492b-a2c1-b1c615b27c71\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.165890 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.167042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.167115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsbn8\" (UniqueName: \"kubernetes.io/projected/b708b2de-ec1b-4cb9-8113-ab44e5437a9c-kube-api-access-qsbn8\") pod \"multus-admission-controller-857f4d67dd-2jxcr\" (UID: \"b708b2de-ec1b-4cb9-8113-ab44e5437a9c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.170612 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.670597321 +0000 UTC m=+36.631817275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.172493 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.189430 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.190546 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.192117 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.199198 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.200142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdpm\" (UniqueName: \"kubernetes.io/projected/26515dd8-3e9b-446d-bdda-547bd85ea373-kube-api-access-pbdpm\") pod \"service-ca-operator-777779d784-x2vtd\" (UID: \"26515dd8-3e9b-446d-bdda-547bd85ea373\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.205934 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.205938 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7vbmz"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.212149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.216811 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.221228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvj9b\" (UniqueName: \"kubernetes.io/projected/c2e534c2-c769-4ad3-942b-d181ed2cf11e-kube-api-access-jvj9b\") pod \"collect-profiles-29525430-b9kh6\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.222509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4r9k\" (UniqueName: \"kubernetes.io/projected/176bf56a-e65f-42f1-a975-298674f1f6a6-kube-api-access-l4r9k\") pod \"ingress-operator-5b745b69d9-4wt7q\" (UID: \"176bf56a-e65f-42f1-a975-298674f1f6a6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.223313 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.229455 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.243910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf4qf\" (UniqueName: \"kubernetes.io/projected/be345565-0341-4290-b5e8-9cf728685a6b-kube-api-access-hf4qf\") pod \"marketplace-operator-79b997595-6hzt4\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.250826 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.268526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.268877 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.768860814 +0000 UTC m=+36.730080768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.268957 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.271668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjl5\" (UniqueName: \"kubernetes.io/projected/ad0ad2f0-2678-4896-9f38-081e37050f36-kube-api-access-dxjl5\") pod \"olm-operator-6b444d44fb-85pd6\" (UID: \"ad0ad2f0-2678-4896-9f38-081e37050f36\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.278059 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q7rpj"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.278258 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.279569 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nxsch"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.317052 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.325702 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-65n7c" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.338663 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 18:29:21 +0000 UTC, rotation deadline is 2026-11-28 01:03:57.41256371 +0000 UTC Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.338718 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6750h29m35.073847647s for next certificate rotation Feb 19 18:34:22 crc kubenswrapper[4749]: W0219 18:34:22.339961 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod864274b1_985f_447a_b8d3_c7d9b2c2751b.slice/crio-c75eaebea175e3d719bbef783bde3a6f5b56db712fedcc306a60259e6e3f3d26 WatchSource:0}: Error finding container c75eaebea175e3d719bbef783bde3a6f5b56db712fedcc306a60259e6e3f3d26: Status 404 returned error can't find the container with id c75eaebea175e3d719bbef783bde3a6f5b56db712fedcc306a60259e6e3f3d26 Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.373598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.374215 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.874200829 +0000 UTC m=+36.835420773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.375316 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.458015 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.460121 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.461485 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.476566 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.476786 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 18:34:22.976765592 +0000 UTC m=+36.937985546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.479503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.479863 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:22.979855092 +0000 UTC m=+36.941075046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.505366 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rnkj5"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.525912 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v8227"] Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.536249 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.580557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.580981 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.08095343 +0000 UTC m=+37.042173384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.581134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.581185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.581230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.581253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.581279 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.583010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.585086 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.085062203 +0000 UTC m=+37.046282147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.585777 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.589780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.590578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.600056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.605851 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9drdb"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.672392 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qbnpr"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.681737 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.681877 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.181835303 +0000 UTC m=+37.143055257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.682054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.682369 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.182356105 +0000 UTC m=+37.143576059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:22 crc kubenswrapper[4749]: W0219 18:34:22.711408 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bccd7a7_c28a_4612_914c_3b8f99324dec.slice/crio-75c2a6f83f85bfa25176764c775f36e4c5d8abf7c96a7a560740f35abad59576 WatchSource:0}: Error finding container 75c2a6f83f85bfa25176764c775f36e4c5d8abf7c96a7a560740f35abad59576: Status 404 returned error can't find the container with id 75c2a6f83f85bfa25176764c775f36e4c5d8abf7c96a7a560740f35abad59576
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.714871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2dbbd"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.727585 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.770170 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jxcr"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.782776 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.783122 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.283105845 +0000 UTC m=+37.244325799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.795497 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.798871 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.818124 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.865401 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.883575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-psm7l"]
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.883794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.886547 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.386533227 +0000 UTC m=+37.347753171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.985709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.985826 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.485801714 +0000 UTC m=+37.447021668 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:22 crc kubenswrapper[4749]: I0219 18:34:22.985911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:22 crc kubenswrapper[4749]: E0219 18:34:22.986715 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.486705455 +0000 UTC m=+37.447925459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.035286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" event={"ID":"9bea494d-956c-4fa1-b65e-7d791d79690c","Type":"ContainerStarted","Data":"75644297a6503def22b8ec4da897b51450d6e6bef46ebc27db46ef71665328e2"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.041126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" event={"ID":"4f215af7-dad4-4dd1-9cc7-20c611eacace","Type":"ContainerStarted","Data":"dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.041397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" event={"ID":"4f215af7-dad4-4dd1-9cc7-20c611eacace","Type":"ContainerStarted","Data":"a2410d7ed1ca93b0dc77f073b57939615bef27b8edc58ce7964b9a5d1380e39a"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.042717 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.042758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6hzt4"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.052467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" event={"ID":"25539c1d-ee15-42e1-8743-3dbed89feb4e","Type":"ContainerStarted","Data":"bddf01d73af6f5ef4e9b46576048c4eebb6b91006e92e9656abbc301089c9357"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.065281 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 18:34:23 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld
Feb 19 18:34:23 crc kubenswrapper[4749]: [+]process-running ok
Feb 19 18:34:23 crc kubenswrapper[4749]: healthz check failed
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.065325 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.068460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kxd92" event={"ID":"e9cd83f3-4f63-4348-b467-b23ba946b7bc","Type":"ContainerStarted","Data":"93d474ee02d0214137f7268fc194ce81aaa797f13dcedec3fac7a0c87b51e565"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.069691 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" event={"ID":"b1a11162-6554-4080-9b1f-e0864a79ec01","Type":"ContainerStarted","Data":"3fe30a1ac9c81d9e353c6e3b53345e544f400fb05c6e4578342a26b0af123e2d"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.069715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" event={"ID":"b1a11162-6554-4080-9b1f-e0864a79ec01","Type":"ContainerStarted","Data":"50de07c05a930148df516e42956f444c5c12beca8e6ace09e9f1db62b7267fb3"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.070388 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.074395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" event={"ID":"ced680c0-5c35-4d74-b553-7f95483907f7","Type":"ContainerStarted","Data":"4cb9e53a6ed855fd34569fb39518ddc31dfd2b3b9eb4684f74b0af8621c0e03e"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.081188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" event={"ID":"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895","Type":"ContainerStarted","Data":"f1a27ccf09d6f92f43fb886ecd2d3b00c3b2636361cfd09fd791fe354023bcc0"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.081214 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-87stm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.081349 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.087054 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.087281 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.587236661 +0000 UTC m=+37.548456615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.087321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.087684 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.58767752 +0000 UTC m=+37.548897474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.088577 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74"
Feb 19 18:34:23 crc kubenswrapper[4749]: W0219 18:34:23.103554 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4882b1fe_f663_4de2_9d42_ae55ca424efe.slice/crio-fa4b3fea1aeffb578958ed871c559d5a8c8525be9dd308884727c3ad8a90ebfa WatchSource:0}: Error finding container fa4b3fea1aeffb578958ed871c559d5a8c8525be9dd308884727c3ad8a90ebfa: Status 404 returned error can't find the container with id fa4b3fea1aeffb578958ed871c559d5a8c8525be9dd308884727c3ad8a90ebfa
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.103702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" event={"ID":"fd3a8256-8304-48a6-bcab-be5fee9f8017","Type":"ContainerStarted","Data":"57f29f6dabad97e75e8625fef16da836cce30f1d77b67b61712dca1ef23e59bf"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.121617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" event={"ID":"fc385e2a-5c57-49bc-a308-57a35663a452","Type":"ContainerStarted","Data":"8c84f0272f56876bff5be7a7892b17af0e5a2c76654a29638d92459e5f79f394"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.132668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.146812 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" event={"ID":"a1cfc2e6-b878-4af9-969b-33a513042b75","Type":"ContainerStarted","Data":"f5d9981eb5f4573cb0abba742594063e8269b0bca49482f1a49b479e43c06a9e"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.146984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" event={"ID":"a1cfc2e6-b878-4af9-969b-33a513042b75","Type":"ContainerStarted","Data":"dbd546725d915e3e7eaca96ba66453ecddbd8264ca281f22e92bb45de474be15"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.147979 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.149605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rnkj5" event={"ID":"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d","Type":"ContainerStarted","Data":"27a35c674a36e91af9928f6e8410b6f6ec228b27013665bd0074ddeaf38e663b"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.150844 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d49ll container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.150893 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.151377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" event={"ID":"864274b1-985f-447a-b8d3-c7d9b2c2751b","Type":"ContainerStarted","Data":"c75eaebea175e3d719bbef783bde3a6f5b56db712fedcc306a60259e6e3f3d26"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.167239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" event={"ID":"4919053a-3b2b-4575-a86f-85004ba4b899","Type":"ContainerStarted","Data":"675a610a29f9a57583dfe68d0a3a856325a8751baa14151d0006724971c8eaa6"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.214580 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.215181 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" podStartSLOduration=4.215170539 podStartE2EDuration="4.215170539s" podCreationTimestamp="2026-02-19 18:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:23.136576712 +0000 UTC m=+37.097796686" watchObservedRunningTime="2026-02-19 18:34:23.215170539 +0000 UTC m=+37.176390493"
Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.217670 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.717648186 +0000 UTC m=+37.678868180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.218559 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.218589 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7vbmz" event={"ID":"41d2c5ca-2f07-4fb0-9822-5d3f7119f56b","Type":"ContainerStarted","Data":"846561b84e1e5fee6a5088094ab465b41e6a6e87d376645426b2ee45bc6bae2a"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.218603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7vbmz" event={"ID":"41d2c5ca-2f07-4fb0-9822-5d3f7119f56b","Type":"ContainerStarted","Data":"554243b257acdb453a921ad7c8bd60fb0725440f6f5824555e37c6b999075199"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.218827 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7vbmz"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.220103 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vbmz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.220132 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vbmz" podUID="41d2c5ca-2f07-4fb0-9822-5d3f7119f56b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.222280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" event={"ID":"ce6dc220-87e7-485b-ac8d-16654ba01e3a","Type":"ContainerStarted","Data":"9a925a85fc782e3960c9fcac3a93afb946f49f4e2dce36e1b8e23fbfbe5ff0ad"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.240169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" event={"ID":"b708b2de-ec1b-4cb9-8113-ab44e5437a9c","Type":"ContainerStarted","Data":"c59a4fab380aa7e4adf890b2c690b13ebcb3ec25522ea5392f5f8b12bf19803c"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.241625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v8227" event={"ID":"0db2db75-ec63-4052-be96-c5c34976fa18","Type":"ContainerStarted","Data":"03f096ff8cb104a3381810669c50a2157c0006109efd18716fb579765934986c"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.259840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" event={"ID":"f8f1b64c-d615-49e7-8b6a-e0f038a58a40","Type":"ContainerStarted","Data":"cae8e74ac12aae92c557e66c0c383bc703699b2c2ae90c92b4ea297987efef20"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.262011 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" podStartSLOduration=17.261812609 podStartE2EDuration="17.261812609s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:23.258241688 +0000 UTC m=+37.219461662" watchObservedRunningTime="2026-02-19 18:34:23.261812609 +0000 UTC m=+37.223032563"
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.283907 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.293272 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.293976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9drdb" event={"ID":"5bccd7a7-c28a-4612-914c-3b8f99324dec","Type":"ContainerStarted","Data":"75c2a6f83f85bfa25176764c775f36e4c5d8abf7c96a7a560740f35abad59576"}
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.297103 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm"]
Feb 19 18:34:23 crc kubenswrapper[4749]: W0219 18:34:23.305862 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8df7789_2cce_4ea5_bd59_996174038a1f.slice/crio-03ad04ea55c7b16a1cc01e2657f9aa23967fcc5113505c343ac5fca73ce402e3 WatchSource:0}: Error finding container 03ad04ea55c7b16a1cc01e2657f9aa23967fcc5113505c343ac5fca73ce402e3: Status 404 returned error can't find the container with id 03ad04ea55c7b16a1cc01e2657f9aa23967fcc5113505c343ac5fca73ce402e3
Feb 19 18:34:23 crc kubenswrapper[4749]: W0219 18:34:23.310188 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e534c2_c769_4ad3_942b_d181ed2cf11e.slice/crio-4904ec2c2ffd5eaf37fd62cc7d2fecac3e8c6cfd5831b62c266d25dd47955074 WatchSource:0}: Error finding container 4904ec2c2ffd5eaf37fd62cc7d2fecac3e8c6cfd5831b62c266d25dd47955074: Status 404 returned error can't find the container with id 4904ec2c2ffd5eaf37fd62cc7d2fecac3e8c6cfd5831b62c266d25dd47955074
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.326778 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.328246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.330955 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.830939141 +0000 UTC m=+37.792159205 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.418296 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2wn9n" podStartSLOduration=17.418282537 podStartE2EDuration="17.418282537s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:23.378414201 +0000 UTC m=+37.339634155" watchObservedRunningTime="2026-02-19 18:34:23.418282537 +0000 UTC m=+37.379502491"
Feb 19 18:34:23 crc kubenswrapper[4749]: W0219 18:34:23.419526 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe83d62_6b6a_4b2e_8f6c_2fda7cb7c012.slice/crio-b5ae2406ef1c07674f7f6c140cf0b4383c615e608dddbf60c7dd163048c7be1e WatchSource:0}: Error finding container b5ae2406ef1c07674f7f6c140cf0b4383c615e608dddbf60c7dd163048c7be1e: Status 404 returned error can't find the container with id b5ae2406ef1c07674f7f6c140cf0b4383c615e608dddbf60c7dd163048c7be1e
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.431092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.434412 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:23.934392993 +0000 UTC m=+37.895612947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.452437 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.518298 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.525604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.527660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.536354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.537072 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.037056937 +0000 UTC m=+37.998276891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.551112 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65n7c"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.575738 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q"]
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.593484 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4dfs6" podStartSLOduration=17.593466779 podStartE2EDuration="17.593466779s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:23.592621991 +0000 UTC m=+37.553841935" watchObservedRunningTime="2026-02-19 18:34:23.593466779 +0000 UTC m=+37.554686733"
Feb 19 18:34:23 crc kubenswrapper[4749]: W0219 18:34:23.619313 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406ec180_fbdf_4a41_84f9_749915f3eaa2.slice/crio-c6c9b9e7d39042995dee8d0ef9e73bf31c45fb6a7283d019bde531b63e7a74cb WatchSource:0}: Error finding container c6c9b9e7d39042995dee8d0ef9e73bf31c45fb6a7283d019bde531b63e7a74cb: Status 404 returned error can't find the container with id c6c9b9e7d39042995dee8d0ef9e73bf31c45fb6a7283d019bde531b63e7a74cb
Feb 19 18:34:23 crc kubenswrapper[4749]: W0219 18:34:23.629641 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99352d0d_2b0b_4aa0_b6a0_41e16070a323.slice/crio-f845eb2c9bc14da5b01893936ba0b94e4ffae8ae2083f4e70b7d639940446fa1 WatchSource:0}: Error finding container f845eb2c9bc14da5b01893936ba0b94e4ffae8ae2083f4e70b7d639940446fa1: Status 404 returned error can't find the container with id f845eb2c9bc14da5b01893936ba0b94e4ffae8ae2083f4e70b7d639940446fa1
Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.637188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.637595 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.137581803 +0000 UTC m=+38.098801757 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.742182 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.742435 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8cg8w" podStartSLOduration=17.742414306 podStartE2EDuration="17.742414306s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:23.695235403 +0000 UTC m=+37.656455357" watchObservedRunningTime="2026-02-19 18:34:23.742414306 +0000 UTC m=+37.703634270" Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.742470 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.242458917 +0000 UTC m=+38.203678871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.827190 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tzw68" podStartSLOduration=16.827172813 podStartE2EDuration="16.827172813s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:23.814642418 +0000 UTC m=+37.775862362" watchObservedRunningTime="2026-02-19 18:34:23.827172813 +0000 UTC m=+37.788392767" Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.855472 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.855867 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.355852805 +0000 UTC m=+38.317072759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.938039 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-497mk" podStartSLOduration=16.938006532 podStartE2EDuration="16.938006532s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:23.900824978 +0000 UTC m=+37.862044932" watchObservedRunningTime="2026-02-19 18:34:23.938006532 +0000 UTC m=+37.899226486" Feb 19 18:34:23 crc kubenswrapper[4749]: I0219 18:34:23.957437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:23 crc kubenswrapper[4749]: E0219 18:34:23.957789 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.457767722 +0000 UTC m=+38.418987676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.045246 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 18:34:24 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 19 18:34:24 crc kubenswrapper[4749]: [+]process-running ok Feb 19 18:34:24 crc kubenswrapper[4749]: healthz check failed Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.045552 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.059399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.059569 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 18:34:24.559547776 +0000 UTC m=+38.520767730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.059711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.060456 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.560443176 +0000 UTC m=+38.521663130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.160935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.163069 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.663000438 +0000 UTC m=+38.624220392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.179505 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-slnmj" podStartSLOduration=17.179483743 podStartE2EDuration="17.179483743s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.137867276 +0000 UTC m=+38.099087230" watchObservedRunningTime="2026-02-19 18:34:24.179483743 +0000 UTC m=+38.140703697" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.256082 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-j6r8j" podStartSLOduration=17.256065303 podStartE2EDuration="17.256065303s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.254410276 +0000 UTC m=+38.215630220" watchObservedRunningTime="2026-02-19 18:34:24.256065303 +0000 UTC m=+38.217285277" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.265375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: 
\"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.267237 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.767216127 +0000 UTC m=+38.728436171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.316906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" event={"ID":"18ce2742-770a-492b-a2c1-b1c615b27c71","Type":"ContainerStarted","Data":"dedb3fbc11908198d0a18338f2f7fdd97b93a1d6d415eb99f2b46288efe2b647"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.318366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" event={"ID":"99352d0d-2b0b-4aa0-b6a0-41e16070a323","Type":"ContainerStarted","Data":"f845eb2c9bc14da5b01893936ba0b94e4ffae8ae2083f4e70b7d639940446fa1"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.321664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" event={"ID":"864274b1-985f-447a-b8d3-c7d9b2c2751b","Type":"ContainerStarted","Data":"eb2ffd05754eaf763074c70905e77a1b725ecdddd43ad23a1adffd765c38c28a"} Feb 19 18:34:24 
crc kubenswrapper[4749]: I0219 18:34:24.326145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-psm7l" event={"ID":"f81702e7-e998-42de-926c-8d704940eab8","Type":"ContainerStarted","Data":"031681cd8d175d095f9477357d1bd5439196d6541f87b15b39110c776eacbf4b"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.326177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-psm7l" event={"ID":"f81702e7-e998-42de-926c-8d704940eab8","Type":"ContainerStarted","Data":"31f3ca6ffd167143745b7fc588c5e6705370893b8d33f2b995f3d506408f47c2"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.328511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"86280d7bf60048e4a58c54502ec22b820d07118691340038159f6e8d444e0115"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.334586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" event={"ID":"4919053a-3b2b-4575-a86f-85004ba4b899","Type":"ContainerStarted","Data":"a4728374cbbe88bdcd059bbf36c3e63d2e70fd274e44aee1e0dc50ba3ba70dac"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.338316 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" podStartSLOduration=17.338300963000002 podStartE2EDuration="17.338300963s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.333684238 +0000 UTC m=+38.294904212" watchObservedRunningTime="2026-02-19 18:34:24.338300963 +0000 UTC m=+38.299520937" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.342967 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7vbmz" podStartSLOduration=18.3429514 podStartE2EDuration="18.3429514s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.300241248 +0000 UTC m=+38.261461202" watchObservedRunningTime="2026-02-19 18:34:24.3429514 +0000 UTC m=+38.304171344" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.355247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" event={"ID":"f8f1b64c-d615-49e7-8b6a-e0f038a58a40","Type":"ContainerStarted","Data":"15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.356279 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.357727 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2dbbd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.357765 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" podUID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.358061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65n7c" 
event={"ID":"eac77a6f-f6c7-404c-9756-c33ff95f0c65","Type":"ContainerStarted","Data":"953afbe2f91c1939471133a78cd98e15085062d824667a256195b4c861a71216"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.359007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" event={"ID":"406ec180-fbdf-4a41-84f9-749915f3eaa2","Type":"ContainerStarted","Data":"c6c9b9e7d39042995dee8d0ef9e73bf31c45fb6a7283d019bde531b63e7a74cb"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.361694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" event={"ID":"ce6dc220-87e7-485b-ac8d-16654ba01e3a","Type":"ContainerStarted","Data":"6aca0d4bb917a27f36d795773e149c8fbc0b183e72ca8cbfca0c96ded0d3631f"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.365383 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" event={"ID":"176bf56a-e65f-42f1-a975-298674f1f6a6","Type":"ContainerStarted","Data":"5823d411427ff0036c9963cdaabd2ac7f5e034db49e5d0816e940798f6521953"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.365438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" event={"ID":"176bf56a-e65f-42f1-a975-298674f1f6a6","Type":"ContainerStarted","Data":"52489ff8a28c7bd24d1cf99953e2404b99cad4e620b74f911369a5bc924a61b6"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.366184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:24 crc kubenswrapper[4749]: 
E0219 18:34:24.366416 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.866393592 +0000 UTC m=+38.827613546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.368635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.370204 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.870187438 +0000 UTC m=+38.831407492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.374746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" event={"ID":"b708b2de-ec1b-4cb9-8113-ab44e5437a9c","Type":"ContainerStarted","Data":"674f7061735fd9b4a44dffac8c54c1a4199dc7dbda49caedd8b5ba9c8dc47a6f"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.379075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v8227" event={"ID":"0db2db75-ec63-4052-be96-c5c34976fa18","Type":"ContainerStarted","Data":"d0f69e17758508ffacfbb872b2f5d358b4b02f3a0f8941a5e978fb6eac1e9d47"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.387549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" event={"ID":"26515dd8-3e9b-446d-bdda-547bd85ea373","Type":"ContainerStarted","Data":"b0c80d598c5e2593edd9969cf610b8401c1696101674e172672dcf7f81c4d865"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.389801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"92e5b4e557d2dc879fb8bd2b32ee86427a532f417e32f827c867a1aa03784290"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.416121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rnkj5" 
event={"ID":"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d","Type":"ContainerStarted","Data":"30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.421959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" event={"ID":"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012","Type":"ContainerStarted","Data":"6185c7ee2ca5621cff070ec15892e347fbd31351314cf50ecf71040a2eb975bc"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.421992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" event={"ID":"cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012","Type":"ContainerStarted","Data":"b5ae2406ef1c07674f7f6c140cf0b4383c615e608dddbf60c7dd163048c7be1e"} Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.422409 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.423973 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" podStartSLOduration=17.42395446 podStartE2EDuration="17.42395446s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.415310634 +0000 UTC m=+38.376530588" watchObservedRunningTime="2026-02-19 18:34:24.42395446 +0000 UTC m=+38.385174414" Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.424107 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7q4b5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" 
start-of-body=
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.424143 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" podUID="cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.425470 4749 generic.go:334] "Generic (PLEG): container finished" podID="9bea494d-956c-4fa1-b65e-7d791d79690c" containerID="188778e732cf6ddfa346ee7cb0ebbde71e7c2f691a317424543ab0c34dfaae2c" exitCode=0
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.425682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" event={"ID":"9bea494d-956c-4fa1-b65e-7d791d79690c","Type":"ContainerDied","Data":"188778e732cf6ddfa346ee7cb0ebbde71e7c2f691a317424543ab0c34dfaae2c"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.442014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" event={"ID":"ad0ad2f0-2678-4896-9f38-081e37050f36","Type":"ContainerStarted","Data":"230b57b2e228b9c36944a1b164a4b8c8863b7b0702c2bcab18039b70daaff68c"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.448315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" event={"ID":"ced680c0-5c35-4d74-b553-7f95483907f7","Type":"ContainerStarted","Data":"f234aa68ad2d8a58a8c5b8ff97606beeb0e481ee03c8a0804d53f72236d93d2e"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.454230 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-v8227" podStartSLOduration=17.454214129 podStartE2EDuration="17.454214129s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.452900769 +0000 UTC m=+38.414120723" watchObservedRunningTime="2026-02-19 18:34:24.454214129 +0000 UTC m=+38.415434083"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.454263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" event={"ID":"25539c1d-ee15-42e1-8743-3dbed89feb4e","Type":"ContainerStarted","Data":"604aca6fe4cf72ae2b3b4ccfb99c9ce1012802fb1bdfa14b172d199c890f06fc"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.459358 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" event={"ID":"5c6fd864-39d6-4de5-9f6c-ec95242b9178","Type":"ContainerStarted","Data":"fb6e13e23f2e221268bbf3218bb2a5e1c1a74db12f01393a5fa34c0b0df7367a"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.461793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" event={"ID":"4882b1fe-f663-4de2-9d42-ae55ca424efe","Type":"ContainerStarted","Data":"fe4a4fc546b9c04a6771d559fa307a3f7fcb56095c6f6fbd774ac70568d848d1"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.461818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" event={"ID":"4882b1fe-f663-4de2-9d42-ae55ca424efe","Type":"ContainerStarted","Data":"fa4b3fea1aeffb578958ed871c559d5a8c8525be9dd308884727c3ad8a90ebfa"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.467688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kxd92" event={"ID":"e9cd83f3-4f63-4348-b467-b23ba946b7bc","Type":"ContainerStarted","Data":"4a715f1fdb64719915d3088c94d8e4a6b7192206b9f68944a4b75e93f9d7a1b1"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.469171 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.470444 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.970422838 +0000 UTC m=+38.931642802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.471437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.472403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" event={"ID":"d8df7789-2cce-4ea5-bd59-996174038a1f","Type":"ContainerStarted","Data":"7eff6e4a628d3f91d9baf92e87479496cf5be782c291303f9f4ad5ffd0d383b8"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.472444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" event={"ID":"d8df7789-2cce-4ea5-bd59-996174038a1f","Type":"ContainerStarted","Data":"03ad04ea55c7b16a1cc01e2657f9aa23967fcc5113505c343ac5fca73ce402e3"}
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.473275 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:24.973262572 +0000 UTC m=+38.934482706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.475056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" event={"ID":"be345565-0341-4290-b5e8-9cf728685a6b","Type":"ContainerStarted","Data":"a59c55537f38b7418549628cd33f7ee6f1115c9173cef2ec8b7941ccd270acaf"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.476140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" event={"ID":"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895","Type":"ContainerStarted","Data":"97cb4d51e64be487afe81d506a13587800947b38d6894f4d75759d5a2eeaafc8"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.477638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" event={"ID":"fd3a8256-8304-48a6-bcab-be5fee9f8017","Type":"ContainerStarted","Data":"98cba11e2f85f3af327313358a3af9aff67ad23937d4b14fb03c662efd30faa9"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.481150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" event={"ID":"92f8c721-43b3-4211-9580-9a1bbd4b61e9","Type":"ContainerStarted","Data":"769081fc17a7c6572b62687808bd27fda68186bc9f3ce4356344c167aeec89e5"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.481179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" event={"ID":"92f8c721-43b3-4211-9580-9a1bbd4b61e9","Type":"ContainerStarted","Data":"8ba7faad912ba15afc244b7783c7b315cc0f0000672fe22316fbfbe13fee0a72"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.484223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" event={"ID":"fc385e2a-5c57-49bc-a308-57a35663a452","Type":"ContainerStarted","Data":"d92ac3e79d550a366db384df008147d5d21e17b10d81590e652941b5b6247751"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.486001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" event={"ID":"c2e534c2-c769-4ad3-942b-d181ed2cf11e","Type":"ContainerStarted","Data":"7b15c2b1a142dfdad1583f214541aacdcd68db7a0d7388d30b6da261a6fe8f8c"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.486040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" event={"ID":"c2e534c2-c769-4ad3-942b-d181ed2cf11e","Type":"ContainerStarted","Data":"4904ec2c2ffd5eaf37fd62cc7d2fecac3e8c6cfd5831b62c266d25dd47955074"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.501168 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tsf9g" podStartSLOduration=17.501122825 podStartE2EDuration="17.501122825s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.500153023 +0000 UTC m=+38.461372987" watchObservedRunningTime="2026-02-19 18:34:24.501122825 +0000 UTC m=+38.462342779"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.507361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"641f1b0ad49469b2eed852c73dd54bb81fb0052918b2196e1dc6a9c8854f0adc"}
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.508255 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-87stm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.508318 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.508494 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vbmz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.508532 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vbmz" podUID="41d2c5ca-2f07-4fb0-9822-5d3f7119f56b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.508579 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d49ll container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.508616 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.537032 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-psm7l" podStartSLOduration=5.537003141 podStartE2EDuration="5.537003141s" podCreationTimestamp="2026-02-19 18:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.535693901 +0000 UTC m=+38.496913865" watchObservedRunningTime="2026-02-19 18:34:24.537003141 +0000 UTC m=+38.498223095"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.574822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.575849 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.075833714 +0000 UTC m=+39.037053668 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.590879 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" podStartSLOduration=17.590847175 podStartE2EDuration="17.590847175s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.583850266 +0000 UTC m=+38.545070230" watchObservedRunningTime="2026-02-19 18:34:24.590847175 +0000 UTC m=+38.552067129"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.619874 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9dlzr" podStartSLOduration=18.619855454 podStartE2EDuration="18.619855454s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.619347433 +0000 UTC m=+38.580567387" watchObservedRunningTime="2026-02-19 18:34:24.619855454 +0000 UTC m=+38.581075408"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.674584 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nxsch" podStartSLOduration=17.674567918 podStartE2EDuration="17.674567918s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.67332922 +0000 UTC m=+38.634549194" watchObservedRunningTime="2026-02-19 18:34:24.674567918 +0000 UTC m=+38.635787872"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.676181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.676525 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.176511733 +0000 UTC m=+39.137731687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.742503 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rnkj5" podStartSLOduration=18.742486403 podStartE2EDuration="18.742486403s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.742151875 +0000 UTC m=+38.703371849" watchObservedRunningTime="2026-02-19 18:34:24.742486403 +0000 UTC m=+38.703706367"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.775472 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.776616 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.776919 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.276905905 +0000 UTC m=+39.238125859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.777215 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-497mk"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.791579 4749 patch_prober.go:28] interesting pod/apiserver-76f77b778f-497mk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]log ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]etcd ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/max-in-flight-filter ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 19 18:34:24 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-startinformers ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-restmapperupdater ok
Feb 19 18:34:24 crc kubenswrapper[4749]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 19 18:34:24 crc kubenswrapper[4749]: livez check failed
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.791632 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-497mk" podUID="c30b3c06-11f3-450e-8793-4bacb3756a3e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.793176 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f62zc" podStartSLOduration=17.793161264 podStartE2EDuration="17.793161264s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.791789623 +0000 UTC m=+38.753009577" watchObservedRunningTime="2026-02-19 18:34:24.793161264 +0000 UTC m=+38.754381228"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.862892 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q68nf" podStartSLOduration=17.86287064 podStartE2EDuration="17.86287064s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.830451703 +0000 UTC m=+38.791671677" watchObservedRunningTime="2026-02-19 18:34:24.86287064 +0000 UTC m=+38.824090594"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.863076 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" podStartSLOduration=17.863070244 podStartE2EDuration="17.863070244s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.861683202 +0000 UTC m=+38.822903186" watchObservedRunningTime="2026-02-19 18:34:24.863070244 +0000 UTC m=+38.824290198"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.884967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.885877 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.385866742 +0000 UTC m=+39.347086696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.945985 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kxd92" podStartSLOduration=5.945960628 podStartE2EDuration="5.945960628s" podCreationTimestamp="2026-02-19 18:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.916533129 +0000 UTC m=+38.877753093" watchObservedRunningTime="2026-02-19 18:34:24.945960628 +0000 UTC m=+38.907180582"
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.986647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.986987 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.48696897 +0000 UTC m=+39.448188924 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:24 crc kubenswrapper[4749]: I0219 18:34:24.987185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:24 crc kubenswrapper[4749]: E0219 18:34:24.987643 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.487618956 +0000 UTC m=+39.448838980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.040491 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 18:34:25 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld
Feb 19 18:34:25 crc kubenswrapper[4749]: [+]process-running ok
Feb 19 18:34:25 crc kubenswrapper[4749]: healthz check failed
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.040781 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.087976 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.088079 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.588057259 +0000 UTC m=+39.549277223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.088289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.088671 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.588659543 +0000 UTC m=+39.549879497 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.189726 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.190162 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.69014487 +0000 UTC m=+39.651364824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.291741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.292096 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.792082507 +0000 UTC m=+39.753302451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.311015 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" podStartSLOduration=18.310990707 podStartE2EDuration="18.310990707s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:24.946346297 +0000 UTC m=+38.907566271" watchObservedRunningTime="2026-02-19 18:34:25.310990707 +0000 UTC m=+39.272210661"
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.314068 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8tc74"]
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.393545 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.393863 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.893844171 +0000 UTC m=+39.855064125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.495389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj"
Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.495789 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:25.995769918 +0000 UTC m=+39.956989942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.515977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c2fde87081cdf4fcb16ced40bc6a8907e8488b844479ee97545ac4c09ed989fb"}
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.516072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.519754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" event={"ID":"ad0ad2f0-2678-4896-9f38-081e37050f36","Type":"ContainerStarted","Data":"4f41906b641dd8190b08cbb99c28b3f654e1f4044e9bc331f868b0ba0526a8a1"}
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.520072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6"
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.521206 4749 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-85pd6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.521260 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" podUID="ad0ad2f0-2678-4896-9f38-081e37050f36" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.523496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" event={"ID":"864274b1-985f-447a-b8d3-c7d9b2c2751b","Type":"ContainerStarted","Data":"69436acdd53c14d9264a3b53f5ec31c394a6a37ee924169d698e253f815d2b52"}
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.536297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65n7c" event={"ID":"eac77a6f-f6c7-404c-9756-c33ff95f0c65","Type":"ContainerStarted","Data":"d17fda1e8d47c8c4c8bde6aecee1563d61a8240cac0e81bed9d609a30bb352ce"}
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.536347 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-65n7c"
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.536361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65n7c" event={"ID":"eac77a6f-f6c7-404c-9756-c33ff95f0c65","Type":"ContainerStarted","Data":"808ce14e66cb6c919e196c56fb728b5a8778480ae1079c1a36cee88e4814b20d"}
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.545446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" event={"ID":"d8df7789-2cce-4ea5-bd59-996174038a1f","Type":"ContainerStarted","Data":"ce01df47cd4d42e4654825542be131cb662017bf554fc2e81474b1dd2370124e"}
Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.547078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42"
event={"ID":"4882b1fe-f663-4de2-9d42-ae55ca424efe","Type":"ContainerStarted","Data":"906c8cbf24c1edae71ad83fb0baea7058d47137128d9cb6443bb93363b2025a6"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.548437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" event={"ID":"fc385e2a-5c57-49bc-a308-57a35663a452","Type":"ContainerStarted","Data":"9c5c11ec14ac4793bf51b4c207c02c3a9db39932addb97c07738020349a76b6f"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.549681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" event={"ID":"26515dd8-3e9b-446d-bdda-547bd85ea373","Type":"ContainerStarted","Data":"cef056d13ca1d3d3385d8e1d7b6056d2251664a5e7135a765a0281f4a626a265"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.551128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2fe08973d2ea76c4b258dc22cc7a89f7bffea5be7e27dc074adabe918b96cfda"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.552912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" event={"ID":"9bea494d-956c-4fa1-b65e-7d791d79690c","Type":"ContainerStarted","Data":"30fe74e9eeaea616b859d3bcf5c7c612288261ae036c4c90d734b0625775edda"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.565567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" event={"ID":"18ce2742-770a-492b-a2c1-b1c615b27c71","Type":"ContainerStarted","Data":"c5efbadf7c01ee70c8557752276437534f3a1d82fbdc7af4b10f1bd19e35cd54"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.569613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" event={"ID":"406ec180-fbdf-4a41-84f9-749915f3eaa2","Type":"ContainerStarted","Data":"4ffa7d49f3f45633b0a2644b7ccd72fce1e90f2b537d25bd4886ed4241ad6b5e"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.592352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" event={"ID":"bb59f1a0-666b-42b1-b0af-6f7bdeb5a895","Type":"ContainerStarted","Data":"c10427653804d1a4b1c87690246f5be1b58b0d821b7e1e0769f79a3e4c7dbfbb"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.593067 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.597244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.598520 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.098504434 +0000 UTC m=+40.059724388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.633238 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q7rpj" podStartSLOduration=18.633205003 podStartE2EDuration="18.633205003s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.558932214 +0000 UTC m=+39.520152178" watchObservedRunningTime="2026-02-19 18:34:25.633205003 +0000 UTC m=+39.594424957" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.639203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" event={"ID":"5c6fd864-39d6-4de5-9f6c-ec95242b9178","Type":"ContainerStarted","Data":"9ad708aecea044121e90e7022e0f3155e14eedf18d669e405305def1bc550199"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.639598 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.650160 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xzpgj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.650233 
4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" podUID="5c6fd864-39d6-4de5-9f6c-ec95242b9178" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.684954 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" podStartSLOduration=18.684939819 podStartE2EDuration="18.684939819s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.683506417 +0000 UTC m=+39.644726371" watchObservedRunningTime="2026-02-19 18:34:25.684939819 +0000 UTC m=+39.646159773" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.685923 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" podStartSLOduration=18.685913862 podStartE2EDuration="18.685913862s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.640400286 +0000 UTC m=+39.601620240" watchObservedRunningTime="2026-02-19 18:34:25.685913862 +0000 UTC m=+39.647133816" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.688051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" event={"ID":"b708b2de-ec1b-4cb9-8113-ab44e5437a9c","Type":"ContainerStarted","Data":"98ccc8fe8642eb3f2962a8b5784671a44af34fce41bc72e446269aadbb4d4e97"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.699075 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.699436 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.199421948 +0000 UTC m=+40.160641902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.711983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" event={"ID":"176bf56a-e65f-42f1-a975-298674f1f6a6","Type":"ContainerStarted","Data":"4ddc5a62b7313ed4fe318bf95724fc5e7fda969ac048daa1445c673b1a4686f8"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.713728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" event={"ID":"99352d0d-2b0b-4aa0-b6a0-41e16070a323","Type":"ContainerStarted","Data":"e8e2140cfbb3647900810f5323d3b59b9e964aadfbfc8e13561cf571248b0c59"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.715329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" event={"ID":"92f8c721-43b3-4211-9580-9a1bbd4b61e9","Type":"ContainerStarted","Data":"a91a058eafb433fffbe5d69c4146738e72eaba80a06cf269db699f05f3a1f827"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.717109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" event={"ID":"be345565-0341-4290-b5e8-9cf728685a6b","Type":"ContainerStarted","Data":"06d5e96737d0f4a47d1dddbd662440c6a106b31f04a61223d71c203bd702097a"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.717195 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.718788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"672fd4dc14f68aa652d7de661300531ceca5742854e930d7283125c7c21fac49"} Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.723107 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d49ll container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.723140 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.723171 4749 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-7q4b5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.723184 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6hzt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.723208 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" podUID="cbe83d62-6b6a-4b2e-8f6c-2fda7cb7c012" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.723233 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.728236 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2dbbd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.728304 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" podUID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" 
containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.753051 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" podStartSLOduration=18.753018897 podStartE2EDuration="18.753018897s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.751861671 +0000 UTC m=+39.713081635" watchObservedRunningTime="2026-02-19 18:34:25.753018897 +0000 UTC m=+39.714238861" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.773485 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qjxjv" podStartSLOduration=18.773460991 podStartE2EDuration="18.773460991s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.771834375 +0000 UTC m=+39.733054329" watchObservedRunningTime="2026-02-19 18:34:25.773460991 +0000 UTC m=+39.734680945" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.801441 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.803861 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.303839502 +0000 UTC m=+40.265059506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.830834 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" podStartSLOduration=18.830810425 podStartE2EDuration="18.830810425s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.802943432 +0000 UTC m=+39.764163396" watchObservedRunningTime="2026-02-19 18:34:25.830810425 +0000 UTC m=+39.792030379" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.840054 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-htrq5" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.852288 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nzbph" podStartSLOduration=18.852255113 podStartE2EDuration="18.852255113s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.830324035 +0000 UTC m=+39.791543989" watchObservedRunningTime="2026-02-19 
18:34:25.852255113 +0000 UTC m=+39.813475067" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.881927 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl4bn" podStartSLOduration=18.881886667 podStartE2EDuration="18.881886667s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.854572786 +0000 UTC m=+39.815792750" watchObservedRunningTime="2026-02-19 18:34:25.881886667 +0000 UTC m=+39.843106621" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.905912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:25 crc kubenswrapper[4749]: E0219 18:34:25.906869 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.406856804 +0000 UTC m=+40.368076758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.943317 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x2vtd" podStartSLOduration=18.943296472 podStartE2EDuration="18.943296472s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.93919433 +0000 UTC m=+39.900414294" watchObservedRunningTime="2026-02-19 18:34:25.943296472 +0000 UTC m=+39.904516426" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.944069 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-65n7c" podStartSLOduration=6.9440634 podStartE2EDuration="6.9440634s" podCreationTimestamp="2026-02-19 18:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.904729016 +0000 UTC m=+39.865948980" watchObservedRunningTime="2026-02-19 18:34:25.9440634 +0000 UTC m=+39.905283354" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.978924 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8bk42" podStartSLOduration=18.978886902 podStartE2EDuration="18.978886902s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.976211751 +0000 UTC m=+39.937431725" watchObservedRunningTime="2026-02-19 18:34:25.978886902 +0000 UTC m=+39.940106856" Feb 19 18:34:25 crc kubenswrapper[4749]: I0219 18:34:25.998182 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qbnpr" podStartSLOduration=18.998167421 podStartE2EDuration="18.998167421s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:25.997825192 +0000 UTC m=+39.959045146" watchObservedRunningTime="2026-02-19 18:34:25.998167421 +0000 UTC m=+39.959387375" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.007874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.008179 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.508163238 +0000 UTC m=+40.469383192 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.033255 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.041767 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 18:34:26 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 19 18:34:26 crc kubenswrapper[4749]: [+]process-running ok Feb 19 18:34:26 crc kubenswrapper[4749]: healthz check failed Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.041821 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.043662 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cgsk" podStartSLOduration=19.043647794 podStartE2EDuration="19.043647794s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 18:34:26.015536775 +0000 UTC m=+39.976756729" watchObservedRunningTime="2026-02-19 18:34:26.043647794 +0000 UTC m=+40.004867748" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.125593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.125657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.127228 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4wt7q" podStartSLOduration=19.127217714 podStartE2EDuration="19.127217714s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:26.121241319 +0000 UTC m=+40.082461273" watchObservedRunningTime="2026-02-19 18:34:26.127217714 +0000 UTC m=+40.088437668" Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.128919 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.628902812 +0000 UTC m=+40.590122766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.139305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8771d522-aad3-4c8d-8f8b-eccc155fbf71-metrics-certs\") pod \"network-metrics-daemon-vw4bt\" (UID: \"8771d522-aad3-4c8d-8f8b-eccc155fbf71\") " pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.159421 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jxcr" podStartSLOduration=19.159401806 podStartE2EDuration="19.159401806s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:26.152836327 +0000 UTC m=+40.114056281" watchObservedRunningTime="2026-02-19 18:34:26.159401806 +0000 UTC m=+40.120621750" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.177904 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k89cm" podStartSLOduration=19.177887516 podStartE2EDuration="19.177887516s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:26.17673533 +0000 UTC m=+40.137955274" watchObservedRunningTime="2026-02-19 
18:34:26.177887516 +0000 UTC m=+40.139107470" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.228581 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.229081 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.7290657 +0000 UTC m=+40.690285654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.230371 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" podStartSLOduration=19.23036139 podStartE2EDuration="19.23036139s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:26.206698272 +0000 UTC m=+40.167918236" watchObservedRunningTime="2026-02-19 18:34:26.23036139 +0000 UTC m=+40.191581344" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.330063 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.330434 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.830417184 +0000 UTC m=+40.791637138 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.406233 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw4bt" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.430986 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.431158 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.931132834 +0000 UTC m=+40.892352788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.431579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.432090 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-19 18:34:26.932078486 +0000 UTC m=+40.893298430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.533078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.533196 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.033174864 +0000 UTC m=+40.994394818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.533549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.533857 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.033845709 +0000 UTC m=+40.995065673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.634941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.635275 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.135246935 +0000 UTC m=+41.096466889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.635434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.637433 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.137418663 +0000 UTC m=+41.098638617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.735927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9drdb" event={"ID":"5bccd7a7-c28a-4612-914c-3b8f99324dec","Type":"ContainerStarted","Data":"b0369b1de6a2076fdc9b0f6c3d33f59ba8a05d41f15ffbbd715342b7867604be"} Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.736241 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6hzt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.736270 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.736499 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" gracePeriod=30 Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.739248 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.739815 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.239791991 +0000 UTC m=+41.201011995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.750407 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-85pd6" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.765324 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vw4bt"] Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.827010 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.827519 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 
18:34:26.829578 4749 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-dwgqx container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.829611 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" podUID="9bea494d-956c-4fa1-b65e-7d791d79690c" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.840518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.841712 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.341661987 +0000 UTC m=+41.302881941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.863164 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:34:26 crc kubenswrapper[4749]: I0219 18:34:26.942689 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:26 crc kubenswrapper[4749]: E0219 18:34:26.942976 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.4429599 +0000 UTC m=+41.404179854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.043761 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 18:34:27 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 19 18:34:27 crc kubenswrapper[4749]: [+]process-running ok Feb 19 18:34:27 crc kubenswrapper[4749]: healthz check failed Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.044157 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.044649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.045070 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 18:34:27.545058171 +0000 UTC m=+41.506278125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.145662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.145847 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.645822492 +0000 UTC m=+41.607042446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.145949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.146246 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.646233102 +0000 UTC m=+41.607453056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.247262 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.247408 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.747384001 +0000 UTC m=+41.708603955 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.247450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.247734 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.747722569 +0000 UTC m=+41.708942523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.348562 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.348917 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.848903419 +0000 UTC m=+41.810123373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.450076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.450717 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:27.950703834 +0000 UTC m=+41.911923788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.551267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.551586 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.051568937 +0000 UTC m=+42.012788891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.652620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.652992 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.152975803 +0000 UTC m=+42.114195757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.738720 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xzpgj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.738765 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" podUID="5c6fd864-39d6-4de5-9f6c-ec95242b9178" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.753195 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.753367 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 18:34:28.253342824 +0000 UTC m=+42.214562768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.753424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.753692 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.253680472 +0000 UTC m=+42.214900416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.757246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vw4bt" event={"ID":"8771d522-aad3-4c8d-8f8b-eccc155fbf71","Type":"ContainerStarted","Data":"69b0b597aa5390a24d2c06edaa26d384872c56b57aed85d7f725f39ba844e155"} Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.757292 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vw4bt" event={"ID":"8771d522-aad3-4c8d-8f8b-eccc155fbf71","Type":"ContainerStarted","Data":"4cdf89812070c28cb77fe76a513735b432842539c1f855d5539a09a37575ba1b"} Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.757308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vw4bt" event={"ID":"8771d522-aad3-4c8d-8f8b-eccc155fbf71","Type":"ContainerStarted","Data":"a66284351d45ea9c31de07d00836a8dac8625c0c261d5125290750c9074c3611"} Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.854265 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.854545 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.354515595 +0000 UTC m=+42.315735549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.854631 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.856113 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.356094171 +0000 UTC m=+42.317314235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:27 crc kubenswrapper[4749]: I0219 18:34:27.956668 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:27 crc kubenswrapper[4749]: E0219 18:34:27.956979 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.456960194 +0000 UTC m=+42.418180148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.031157 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2n4lk"] Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.032238 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.034055 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.037950 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 18:34:28 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 19 18:34:28 crc kubenswrapper[4749]: [+]process-running ok Feb 19 18:34:28 crc kubenswrapper[4749]: healthz check failed Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.038436 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.058312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-utilities\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.058639 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmq7w\" (UniqueName: \"kubernetes.io/projected/e19c61ad-b387-457b-814b-e382b0265880-kube-api-access-gmq7w\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.058753 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-catalog-content\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.058836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.058485 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2n4lk"] Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.059182 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.559165997 +0000 UTC m=+42.520385951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.160290 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.160484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmq7w\" (UniqueName: \"kubernetes.io/projected/e19c61ad-b387-457b-814b-e382b0265880-kube-api-access-gmq7w\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.160528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-catalog-content\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.161285 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-catalog-content\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " 
pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.161360 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.66134232 +0000 UTC m=+42.622562274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.161391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-utilities\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.161615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-utilities\") pod \"certified-operators-2n4lk\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.189981 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmq7w\" (UniqueName: \"kubernetes.io/projected/e19c61ad-b387-457b-814b-e382b0265880-kube-api-access-gmq7w\") pod \"certified-operators-2n4lk\" (UID: 
\"e19c61ad-b387-457b-814b-e382b0265880\") " pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.234226 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5psfv"] Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.235158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.245653 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.257997 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5psfv"] Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.262482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.262581 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-utilities\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.262615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/f9165e83-4c09-4c44-b185-8f8922fcdad7-kube-api-access-7dx7n\") pod 
\"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.262688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-catalog-content\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.263003 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.7629891 +0000 UTC m=+42.724209054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.345542 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.363254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.363452 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.863421794 +0000 UTC m=+42.824641758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.363487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-catalog-content\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.363519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.363574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-utilities\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.363600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/f9165e83-4c09-4c44-b185-8f8922fcdad7-kube-api-access-7dx7n\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.363865 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:28.863850003 +0000 UTC m=+42.825069957 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.364022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-catalog-content\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.364071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-utilities\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.416740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/f9165e83-4c09-4c44-b185-8f8922fcdad7-kube-api-access-7dx7n\") pod \"community-operators-5psfv\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.443739 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pdxkf"] Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.444623 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.464840 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.465135 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtxh\" (UniqueName: \"kubernetes.io/projected/81536517-730c-4da7-b371-efe28f18a1f3-kube-api-access-4wtxh\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.465194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-utilities\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.483870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-catalog-content\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.484306 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-19 18:34:28.984286172 +0000 UTC m=+42.945506126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.490642 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pdxkf"] Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.549464 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.587642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtxh\" (UniqueName: \"kubernetes.io/projected/81536517-730c-4da7-b371-efe28f18a1f3-kube-api-access-4wtxh\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.587766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-utilities\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.587791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.587823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-catalog-content\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.588234 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-catalog-content\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.588650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-utilities\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.588862 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.088851909 +0000 UTC m=+43.050071863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.619187 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtxh\" (UniqueName: \"kubernetes.io/projected/81536517-730c-4da7-b371-efe28f18a1f3-kube-api-access-4wtxh\") pod \"certified-operators-pdxkf\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") " pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.664085 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rmg2b"] Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.665212 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.690762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.691091 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 18:34:29.191076123 +0000 UTC m=+43.152296077 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.692705 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmg2b"] Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.763498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.796131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmzbj\" (UniqueName: \"kubernetes.io/projected/e3563ed4-2c84-4219-9843-c95a8bba26ac-kube-api-access-wmzbj\") pod \"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.796441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.796469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-utilities\") pod \"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.796529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-catalog-content\") pod \"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.796856 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.296841008 +0000 UTC m=+43.258060962 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.822322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9drdb" event={"ID":"5bccd7a7-c28a-4612-914c-3b8f99324dec","Type":"ContainerStarted","Data":"ebf27c4f2123c59cced4a19dc97c9df1a33717a79414c4cc88eb4ac6d5f27726"} Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.855718 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2n4lk"] Feb 19 18:34:28 crc kubenswrapper[4749]: W0219 
18:34:28.886632 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19c61ad_b387_457b_814b_e382b0265880.slice/crio-7d9a9e717d9257a3572dbf5b955b8ff29d67f86c7d002689e7f3edc18a3b0060 WatchSource:0}: Error finding container 7d9a9e717d9257a3572dbf5b955b8ff29d67f86c7d002689e7f3edc18a3b0060: Status 404 returned error can't find the container with id 7d9a9e717d9257a3572dbf5b955b8ff29d67f86c7d002689e7f3edc18a3b0060 Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.901894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.902051 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmzbj\" (UniqueName: \"kubernetes.io/projected/e3563ed4-2c84-4219-9843-c95a8bba26ac-kube-api-access-wmzbj\") pod \"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.902106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-utilities\") pod \"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.902207 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-catalog-content\") pod \"community-operators-rmg2b\" (UID: 
\"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.902534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-utilities\") pod \"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: E0219 18:34:28.902608 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.402593201 +0000 UTC m=+43.363813155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.903656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-catalog-content\") pod \"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:28 crc kubenswrapper[4749]: I0219 18:34:28.934607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmzbj\" (UniqueName: \"kubernetes.io/projected/e3563ed4-2c84-4219-9843-c95a8bba26ac-kube-api-access-wmzbj\") pod 
\"community-operators-rmg2b\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.003243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.004695 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.504678502 +0000 UTC m=+43.465898456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.013814 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.037266 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vw4bt" podStartSLOduration=23.037249763 podStartE2EDuration="23.037249763s" podCreationTimestamp="2026-02-19 18:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:28.86515645 +0000 UTC m=+42.826376414" watchObservedRunningTime="2026-02-19 18:34:29.037249763 +0000 UTC m=+42.998469727" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.038725 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5psfv"] Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.049251 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 18:34:29 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 19 18:34:29 crc kubenswrapper[4749]: [+]process-running ok Feb 19 18:34:29 crc kubenswrapper[4749]: healthz check failed Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.049311 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.103762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.104047 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.604018201 +0000 UTC m=+43.565238145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.204731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.205322 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.705303073 +0000 UTC m=+43.666523017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.271332 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pdxkf"] Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.306833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.307253 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.8072219 +0000 UTC m=+43.768441854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.307916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.308587 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.80853037 +0000 UTC m=+43.769750324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: W0219 18:34:29.331985 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81536517_730c_4da7_b371_efe28f18a1f3.slice/crio-f3f2cf87c27b40ab42ea2ac6f61063e28fcf1cf84b86338bb990aaa53d21e39d WatchSource:0}: Error finding container f3f2cf87c27b40ab42ea2ac6f61063e28fcf1cf84b86338bb990aaa53d21e39d: Status 404 returned error can't find the container with id f3f2cf87c27b40ab42ea2ac6f61063e28fcf1cf84b86338bb990aaa53d21e39d Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.418566 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.418972 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:29.918951771 +0000 UTC m=+43.880171725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.467821 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmg2b"] Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.503350 4749 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.519818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.520155 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:30.020144141 +0000 UTC m=+43.981364095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.620789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.620912 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:30.120888181 +0000 UTC m=+44.082108135 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.620957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.621317 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:30.121305751 +0000 UTC m=+44.082525705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.721978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.722158 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 18:34:30.222132773 +0000 UTC m=+44.183352727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.722310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: E0219 18:34:29.722608 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 18:34:30.222595064 +0000 UTC m=+44.183815028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w4nkj" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.779107 4749 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T18:34:29.503370589Z","Handler":null,"Name":""} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.781590 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.787019 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-497mk" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.794843 4749 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.794877 4749 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.822888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.827309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.829558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9drdb" event={"ID":"5bccd7a7-c28a-4612-914c-3b8f99324dec","Type":"ContainerStarted","Data":"8ecdce1e2d535763c88b7bd8fb028f224499b4295d6653e97d9b6201370f6212"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.829626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9drdb" event={"ID":"5bccd7a7-c28a-4612-914c-3b8f99324dec","Type":"ContainerStarted","Data":"39dec356c0109e897b49870d00e7ef0db10374462ffc82d872787991030b9710"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.831469 4749 generic.go:334] "Generic (PLEG): container finished" podID="e19c61ad-b387-457b-814b-e382b0265880" containerID="e707574541e931655daa9470cbd350cc04c2cdb19a5980a945380d7f511b0a37" exitCode=0 Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.831544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n4lk" event={"ID":"e19c61ad-b387-457b-814b-e382b0265880","Type":"ContainerDied","Data":"e707574541e931655daa9470cbd350cc04c2cdb19a5980a945380d7f511b0a37"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.831569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n4lk" 
event={"ID":"e19c61ad-b387-457b-814b-e382b0265880","Type":"ContainerStarted","Data":"7d9a9e717d9257a3572dbf5b955b8ff29d67f86c7d002689e7f3edc18a3b0060"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.833416 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.835977 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2e534c2-c769-4ad3-942b-d181ed2cf11e" containerID="7b15c2b1a142dfdad1583f214541aacdcd68db7a0d7388d30b6da261a6fe8f8c" exitCode=0 Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.836062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" event={"ID":"c2e534c2-c769-4ad3-942b-d181ed2cf11e","Type":"ContainerDied","Data":"7b15c2b1a142dfdad1583f214541aacdcd68db7a0d7388d30b6da261a6fe8f8c"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.840330 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerID="8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80" exitCode=0 Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.840416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmg2b" event={"ID":"e3563ed4-2c84-4219-9843-c95a8bba26ac","Type":"ContainerDied","Data":"8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.840448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmg2b" event={"ID":"e3563ed4-2c84-4219-9843-c95a8bba26ac","Type":"ContainerStarted","Data":"99b07d02a0f5faceb901e01468eaee2fef56e89ded85c8ec2020b22782ca5a07"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.846767 4749 generic.go:334] "Generic (PLEG): container finished" podID="81536517-730c-4da7-b371-efe28f18a1f3" 
containerID="5513c34640b835ca8331f826dd692abc13d4195f0163476d0e0db5e09bb115a0" exitCode=0 Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.846850 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdxkf" event={"ID":"81536517-730c-4da7-b371-efe28f18a1f3","Type":"ContainerDied","Data":"5513c34640b835ca8331f826dd692abc13d4195f0163476d0e0db5e09bb115a0"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.846875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdxkf" event={"ID":"81536517-730c-4da7-b371-efe28f18a1f3","Type":"ContainerStarted","Data":"f3f2cf87c27b40ab42ea2ac6f61063e28fcf1cf84b86338bb990aaa53d21e39d"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.855802 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerID="618acf69a1453aa4006f93fdac71b7f649fbb6268f86664da9c8c53a8fc772a5" exitCode=0 Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.856441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5psfv" event={"ID":"f9165e83-4c09-4c44-b185-8f8922fcdad7","Type":"ContainerDied","Data":"618acf69a1453aa4006f93fdac71b7f649fbb6268f86664da9c8c53a8fc772a5"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.856473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5psfv" event={"ID":"f9165e83-4c09-4c44-b185-8f8922fcdad7","Type":"ContainerStarted","Data":"0e3ff519c4aeb91a96fa60a1753584bc8bc911794cc34b45677ef82a49543a56"} Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.924814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: 
\"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.973415 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.973485 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:29 crc kubenswrapper[4749]: I0219 18:34:29.999608 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9drdb" podStartSLOduration=10.999584321 podStartE2EDuration="10.999584321s" podCreationTimestamp="2026-02-19 18:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:29.993564094 +0000 UTC m=+43.954784068" watchObservedRunningTime="2026-02-19 18:34:29.999584321 +0000 UTC m=+43.960804275" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.036137 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.043819 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zqrzw"] Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.043925 4749 patch_prober.go:28] interesting pod/router-default-5444994796-slnmj 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 18:34:30 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 19 18:34:30 crc kubenswrapper[4749]: [+]process-running ok Feb 19 18:34:30 crc kubenswrapper[4749]: healthz check failed Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.043965 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-slnmj" podUID="4586ae5a-18b0-474f-8581-a2cd850ea693" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.044822 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.047284 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.053326 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqrzw"] Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.067376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w4nkj\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.093149 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.093867 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.095719 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.095788 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.104600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.127721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-catalog-content\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.127835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-utilities\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.127909 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.128047 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-kube-api-access-pfdxc\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.128133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.229258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-kube-api-access-pfdxc\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.229339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.229366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-catalog-content\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 
18:34:30.229393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-utilities\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.229410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.229886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-utilities\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.229996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-catalog-content\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.230112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.242164 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.254791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-kube-api-access-pfdxc\") pod \"redhat-marketplace-zqrzw\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.260466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.369557 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.413269 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.436265 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5cv"] Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.437651 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.443780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7ss\" (UniqueName: \"kubernetes.io/projected/856ea752-2729-4936-96aa-423c76975a34-kube-api-access-pq7ss\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.443873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-catalog-content\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.443908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-utilities\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.456339 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5cv"] Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.545299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7ss\" (UniqueName: \"kubernetes.io/projected/856ea752-2729-4936-96aa-423c76975a34-kube-api-access-pq7ss\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.545365 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-catalog-content\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.545387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-utilities\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.545778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-utilities\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.546243 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-catalog-content\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.563584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7ss\" (UniqueName: \"kubernetes.io/projected/856ea752-2729-4936-96aa-423c76975a34-kube-api-access-pq7ss\") pod \"redhat-marketplace-tb5cv\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.569712 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-w4nkj"] Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.694224 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.764633 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqrzw"] Feb 19 18:34:30 crc kubenswrapper[4749]: W0219 18:34:30.784216 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce02a175_8cf2_4b11_b1b6_a3e0eb2fe4b2.slice/crio-bf2a63de371b3f6e96e35c39eec2619090881a550d87edd1d374d5570b71dd4b WatchSource:0}: Error finding container bf2a63de371b3f6e96e35c39eec2619090881a550d87edd1d374d5570b71dd4b: Status 404 returned error can't find the container with id bf2a63de371b3f6e96e35c39eec2619090881a550d87edd1d374d5570b71dd4b Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.827079 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.874820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" event={"ID":"ac1677b6-8344-44c7-a5fc-2924da30ddbc","Type":"ContainerStarted","Data":"8a953d8d4494f7a77d0414437504a1f253340983b72af14119c3b528e38ad8ec"} Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.874886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" event={"ID":"ac1677b6-8344-44c7-a5fc-2924da30ddbc","Type":"ContainerStarted","Data":"547682b9d6cb014df5f439a576c469d484e31e3ec2c85b06f7e26882530b28d2"} Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.874958 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.889287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqrzw" event={"ID":"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2","Type":"ContainerStarted","Data":"bf2a63de371b3f6e96e35c39eec2619090881a550d87edd1d374d5570b71dd4b"} Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.891410 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" podStartSLOduration=23.891355185 podStartE2EDuration="23.891355185s" podCreationTimestamp="2026-02-19 18:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:30.891061359 +0000 UTC m=+44.852281313" watchObservedRunningTime="2026-02-19 18:34:30.891355185 +0000 UTC m=+44.852575139" Feb 19 18:34:30 crc kubenswrapper[4749]: I0219 18:34:30.929525 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 18:34:30 crc kubenswrapper[4749]: W0219 18:34:30.946446 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod14a874c1_cd99_4392_bbb3_bdbe9d81b8cc.slice/crio-cbaea050081ba30a4e83fda3d18ed885730621f3a6b2ab9994009f3d0d4b60f5 WatchSource:0}: Error finding container cbaea050081ba30a4e83fda3d18ed885730621f3a6b2ab9994009f3d0d4b60f5: Status 404 returned error can't find the container with id cbaea050081ba30a4e83fda3d18ed885730621f3a6b2ab9994009f3d0d4b60f5 Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.042990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.057859 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-slnmj" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.233469 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5cv"] Feb 19 18:34:31 crc kubenswrapper[4749]: W0219 18:34:31.271068 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod856ea752_2729_4936_96aa_423c76975a34.slice/crio-9ed5a059c4db7002016b2370e9270cac7436ce74b6f6c6ce8ed82e84e2ed6e4e WatchSource:0}: Error finding container 9ed5a059c4db7002016b2370e9270cac7436ce74b6f6c6ce8ed82e84e2ed6e4e: Status 404 returned error can't find the container with id 9ed5a059c4db7002016b2370e9270cac7436ce74b6f6c6ce8ed82e84e2ed6e4e Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.318985 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.434478 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mw89w"] Feb 19 18:34:31 crc kubenswrapper[4749]: E0219 18:34:31.434974 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e534c2-c769-4ad3-942b-d181ed2cf11e" containerName="collect-profiles" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.434988 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e534c2-c769-4ad3-942b-d181ed2cf11e" containerName="collect-profiles" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.435119 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e534c2-c769-4ad3-942b-d181ed2cf11e" containerName="collect-profiles" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.435858 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.438529 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.447873 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mw89w"] Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.459666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvj9b\" (UniqueName: \"kubernetes.io/projected/c2e534c2-c769-4ad3-942b-d181ed2cf11e-kube-api-access-jvj9b\") pod \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.459744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c2e534c2-c769-4ad3-942b-d181ed2cf11e-secret-volume\") pod \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.459804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e534c2-c769-4ad3-942b-d181ed2cf11e-config-volume\") pod \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\" (UID: \"c2e534c2-c769-4ad3-942b-d181ed2cf11e\") " Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.460790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e534c2-c769-4ad3-942b-d181ed2cf11e-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2e534c2-c769-4ad3-942b-d181ed2cf11e" (UID: "c2e534c2-c769-4ad3-942b-d181ed2cf11e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.468907 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e534c2-c769-4ad3-942b-d181ed2cf11e-kube-api-access-jvj9b" (OuterVolumeSpecName: "kube-api-access-jvj9b") pod "c2e534c2-c769-4ad3-942b-d181ed2cf11e" (UID: "c2e534c2-c769-4ad3-942b-d181ed2cf11e"). InnerVolumeSpecName "kube-api-access-jvj9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.469263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e534c2-c769-4ad3-942b-d181ed2cf11e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2e534c2-c769-4ad3-942b-d181ed2cf11e" (UID: "c2e534c2-c769-4ad3-942b-d181ed2cf11e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.520918 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.521532 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.527157 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.530831 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.531007 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.553780 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.561415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflbh\" (UniqueName: \"kubernetes.io/projected/69277352-22e8-4094-944f-bb38a3fb3a83-kube-api-access-bflbh\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.561458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-utilities\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc 
kubenswrapper[4749]: I0219 18:34:31.561501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-catalog-content\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.561548 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e534c2-c769-4ad3-942b-d181ed2cf11e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.561560 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvj9b\" (UniqueName: \"kubernetes.io/projected/c2e534c2-c769-4ad3-942b-d181ed2cf11e-kube-api-access-jvj9b\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.561570 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e534c2-c769-4ad3-942b-d181ed2cf11e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.662905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7d859ff-962c-4177-8315-c296da5a53d7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.662960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflbh\" (UniqueName: \"kubernetes.io/projected/69277352-22e8-4094-944f-bb38a3fb3a83-kube-api-access-bflbh\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " 
pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.662992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-utilities\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.663050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7d859ff-962c-4177-8315-c296da5a53d7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.663075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-catalog-content\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.664442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-utilities\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.664608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-catalog-content\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 
18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.682486 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vbmz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.682536 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7vbmz" podUID="41d2c5ca-2f07-4fb0-9822-5d3f7119f56b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.683557 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vbmz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.683582 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vbmz" podUID="41d2c5ca-2f07-4fb0-9822-5d3f7119f56b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.698399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflbh\" (UniqueName: \"kubernetes.io/projected/69277352-22e8-4094-944f-bb38a3fb3a83-kube-api-access-bflbh\") pod \"redhat-operators-mw89w\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.764204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/b7d859ff-962c-4177-8315-c296da5a53d7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.764286 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7d859ff-962c-4177-8315-c296da5a53d7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.764728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7d859ff-962c-4177-8315-c296da5a53d7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.771508 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.788901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7d859ff-962c-4177-8315-c296da5a53d7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.839979 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.840314 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.841944 4749 patch_prober.go:28] interesting pod/console-f9d7485db-rnkj5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.841982 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rnkj5" podUID="be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.854501 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.867405 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tscgs"] Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.869786 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.877160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dwgqx" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.888545 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.911822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc","Type":"ContainerStarted","Data":"12e447f9feebd31211fba72ea29dddbc8849d173212ac3c18d5ffe305c2da628"} Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.911861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc","Type":"ContainerStarted","Data":"cbaea050081ba30a4e83fda3d18ed885730621f3a6b2ab9994009f3d0d4b60f5"} Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.950440 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tscgs"] Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.951815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" event={"ID":"c2e534c2-c769-4ad3-942b-d181ed2cf11e","Type":"ContainerDied","Data":"4904ec2c2ffd5eaf37fd62cc7d2fecac3e8c6cfd5831b62c266d25dd47955074"} Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.951853 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4904ec2c2ffd5eaf37fd62cc7d2fecac3e8c6cfd5831b62c266d25dd47955074" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.951939 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.958299 4749 generic.go:334] "Generic (PLEG): container finished" podID="856ea752-2729-4936-96aa-423c76975a34" containerID="6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1" exitCode=0 Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.958353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5cv" event={"ID":"856ea752-2729-4936-96aa-423c76975a34","Type":"ContainerDied","Data":"6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1"} Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.958376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5cv" event={"ID":"856ea752-2729-4936-96aa-423c76975a34","Type":"ContainerStarted","Data":"9ed5a059c4db7002016b2370e9270cac7436ce74b6f6c6ce8ed82e84e2ed6e4e"} Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.975668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-utilities\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.975718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn4q6\" (UniqueName: \"kubernetes.io/projected/8955e517-c79b-4f52-9e06-a399c24532cf-kube-api-access-qn4q6\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.975801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-catalog-content\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.981608 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerID="d740085761000fb6a8cb0ebd4843bc319d78a11d5b161cd9188174f09be3c54c" exitCode=0 Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.981752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqrzw" event={"ID":"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2","Type":"ContainerDied","Data":"d740085761000fb6a8cb0ebd4843bc319d78a11d5b161cd9188174f09be3c54c"} Feb 19 18:34:31 crc kubenswrapper[4749]: I0219 18:34:31.991301 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.99118152 podStartE2EDuration="1.99118152s" podCreationTimestamp="2026-02-19 18:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:31.981714705 +0000 UTC m=+45.942934659" watchObservedRunningTime="2026-02-19 18:34:31.99118152 +0000 UTC m=+45.952401474" Feb 19 18:34:32 crc kubenswrapper[4749]: E0219 18:34:32.022762 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:32 crc kubenswrapper[4749]: E0219 18:34:32.029140 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:32 crc kubenswrapper[4749]: E0219 18:34:32.044710 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:32 crc kubenswrapper[4749]: E0219 18:34:32.044865 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.080209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-utilities\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.080255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn4q6\" (UniqueName: \"kubernetes.io/projected/8955e517-c79b-4f52-9e06-a399c24532cf-kube-api-access-qn4q6\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.080296 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-catalog-content\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.081626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-catalog-content\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.082969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-utilities\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.119412 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn4q6\" (UniqueName: \"kubernetes.io/projected/8955e517-c79b-4f52-9e06-a399c24532cf-kube-api-access-qn4q6\") pod \"redhat-operators-tscgs\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.201337 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.253438 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7q4b5" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.256106 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xzpgj" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.262354 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.302829 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mw89w"] Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.432631 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.639604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tscgs"] Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.991136 4749 generic.go:334] "Generic (PLEG): container finished" podID="8955e517-c79b-4f52-9e06-a399c24532cf" containerID="00be5b4119000da1bf864d74735e0f2aee7811b98ab0bc3f7076534ffdc1ceff" exitCode=0 Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.991261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tscgs" event={"ID":"8955e517-c79b-4f52-9e06-a399c24532cf","Type":"ContainerDied","Data":"00be5b4119000da1bf864d74735e0f2aee7811b98ab0bc3f7076534ffdc1ceff"} Feb 19 18:34:32 crc kubenswrapper[4749]: I0219 18:34:32.991554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tscgs" 
event={"ID":"8955e517-c79b-4f52-9e06-a399c24532cf","Type":"ContainerStarted","Data":"41df819f21891fed71c7987d061f4f46f1145f3e2b3e03e9b70c9e65a0d1c322"} Feb 19 18:34:33 crc kubenswrapper[4749]: I0219 18:34:33.026288 4749 generic.go:334] "Generic (PLEG): container finished" podID="14a874c1-cd99-4392-bbb3-bdbe9d81b8cc" containerID="12e447f9feebd31211fba72ea29dddbc8849d173212ac3c18d5ffe305c2da628" exitCode=0 Feb 19 18:34:33 crc kubenswrapper[4749]: I0219 18:34:33.026372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc","Type":"ContainerDied","Data":"12e447f9feebd31211fba72ea29dddbc8849d173212ac3c18d5ffe305c2da628"} Feb 19 18:34:33 crc kubenswrapper[4749]: I0219 18:34:33.030618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b7d859ff-962c-4177-8315-c296da5a53d7","Type":"ContainerStarted","Data":"d200da8db51527fb574e7aca5e34a4852e4629abcb63ad0f23b4501e1bb6cf2b"} Feb 19 18:34:33 crc kubenswrapper[4749]: I0219 18:34:33.035324 4749 generic.go:334] "Generic (PLEG): container finished" podID="69277352-22e8-4094-944f-bb38a3fb3a83" containerID="5620579ece5c2e07ef2e9c99aa6e64eb16c69fac1b649920764774b970dda932" exitCode=0 Feb 19 18:34:33 crc kubenswrapper[4749]: I0219 18:34:33.035367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mw89w" event={"ID":"69277352-22e8-4094-944f-bb38a3fb3a83","Type":"ContainerDied","Data":"5620579ece5c2e07ef2e9c99aa6e64eb16c69fac1b649920764774b970dda932"} Feb 19 18:34:33 crc kubenswrapper[4749]: I0219 18:34:33.035390 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mw89w" event={"ID":"69277352-22e8-4094-944f-bb38a3fb3a83","Type":"ContainerStarted","Data":"c6064e4f35d132ef526cf97b5d5f70a023cae975be9417d797fce56386080b2c"} Feb 19 18:34:34 crc kubenswrapper[4749]: 
I0219 18:34:34.043036 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7d859ff-962c-4177-8315-c296da5a53d7" containerID="8cd11804cf2300601627584b77bb452ba25886a55ce28837d6e6e426b865aa7a" exitCode=0 Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.043182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b7d859ff-962c-4177-8315-c296da5a53d7","Type":"ContainerDied","Data":"8cd11804cf2300601627584b77bb452ba25886a55ce28837d6e6e426b865aa7a"} Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.311897 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.330658 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-65n7c" Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.420264 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kube-api-access\") pod \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.420342 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kubelet-dir\") pod \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\" (UID: \"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc\") " Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.421444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "14a874c1-cd99-4392-bbb3-bdbe9d81b8cc" (UID: "14a874c1-cd99-4392-bbb3-bdbe9d81b8cc"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.439790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "14a874c1-cd99-4392-bbb3-bdbe9d81b8cc" (UID: "14a874c1-cd99-4392-bbb3-bdbe9d81b8cc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.524772 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:34 crc kubenswrapper[4749]: I0219 18:34:34.524806 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a874c1-cd99-4392-bbb3-bdbe9d81b8cc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:35 crc kubenswrapper[4749]: I0219 18:34:35.070950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"14a874c1-cd99-4392-bbb3-bdbe9d81b8cc","Type":"ContainerDied","Data":"cbaea050081ba30a4e83fda3d18ed885730621f3a6b2ab9994009f3d0d4b60f5"} Feb 19 18:34:35 crc kubenswrapper[4749]: I0219 18:34:35.070993 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbaea050081ba30a4e83fda3d18ed885730621f3a6b2ab9994009f3d0d4b60f5" Feb 19 18:34:35 crc kubenswrapper[4749]: I0219 18:34:35.071002 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 18:34:35 crc kubenswrapper[4749]: I0219 18:34:35.896443 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:35 crc kubenswrapper[4749]: I0219 18:34:35.978840 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7d859ff-962c-4177-8315-c296da5a53d7-kube-api-access\") pod \"b7d859ff-962c-4177-8315-c296da5a53d7\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " Feb 19 18:34:35 crc kubenswrapper[4749]: I0219 18:34:35.984886 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d859ff-962c-4177-8315-c296da5a53d7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b7d859ff-962c-4177-8315-c296da5a53d7" (UID: "b7d859ff-962c-4177-8315-c296da5a53d7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:34:36 crc kubenswrapper[4749]: I0219 18:34:36.079929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7d859ff-962c-4177-8315-c296da5a53d7-kubelet-dir\") pod \"b7d859ff-962c-4177-8315-c296da5a53d7\" (UID: \"b7d859ff-962c-4177-8315-c296da5a53d7\") " Feb 19 18:34:36 crc kubenswrapper[4749]: I0219 18:34:36.080140 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7d859ff-962c-4177-8315-c296da5a53d7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:36 crc kubenswrapper[4749]: I0219 18:34:36.080115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7d859ff-962c-4177-8315-c296da5a53d7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b7d859ff-962c-4177-8315-c296da5a53d7" (UID: "b7d859ff-962c-4177-8315-c296da5a53d7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:34:36 crc kubenswrapper[4749]: I0219 18:34:36.094833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b7d859ff-962c-4177-8315-c296da5a53d7","Type":"ContainerDied","Data":"d200da8db51527fb574e7aca5e34a4852e4629abcb63ad0f23b4501e1bb6cf2b"} Feb 19 18:34:36 crc kubenswrapper[4749]: I0219 18:34:36.094872 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d200da8db51527fb574e7aca5e34a4852e4629abcb63ad0f23b4501e1bb6cf2b" Feb 19 18:34:36 crc kubenswrapper[4749]: I0219 18:34:36.094890 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 18:34:36 crc kubenswrapper[4749]: I0219 18:34:36.182130 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7d859ff-962c-4177-8315-c296da5a53d7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:34:39 crc kubenswrapper[4749]: I0219 18:34:39.477484 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 18:34:39 crc kubenswrapper[4749]: I0219 18:34:39.490312 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 18:34:41 crc kubenswrapper[4749]: I0219 18:34:41.680800 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vbmz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 19 18:34:41 crc kubenswrapper[4749]: I0219 18:34:41.680856 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vbmz" podUID="41d2c5ca-2f07-4fb0-9822-5d3f7119f56b" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 19 18:34:41 crc kubenswrapper[4749]: I0219 18:34:41.680875 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vbmz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 19 18:34:41 crc kubenswrapper[4749]: I0219 18:34:41.680924 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7vbmz" podUID="41d2c5ca-2f07-4fb0-9822-5d3f7119f56b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 19 18:34:41 crc kubenswrapper[4749]: I0219 18:34:41.832806 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:41 crc kubenswrapper[4749]: I0219 18:34:41.836336 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rnkj5" Feb 19 18:34:41 crc kubenswrapper[4749]: I0219 18:34:41.852385 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.852366843 podStartE2EDuration="2.852366843s" podCreationTimestamp="2026-02-19 18:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:41.848475495 +0000 UTC m=+55.809695449" watchObservedRunningTime="2026-02-19 18:34:41.852366843 +0000 UTC m=+55.813586797" Feb 19 18:34:41 crc kubenswrapper[4749]: E0219 18:34:41.994398 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:41 crc kubenswrapper[4749]: E0219 18:34:41.998393 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:42 crc kubenswrapper[4749]: E0219 18:34:42.001346 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:42 crc kubenswrapper[4749]: E0219 18:34:42.001398 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" Feb 19 18:34:42 crc kubenswrapper[4749]: I0219 18:34:42.029468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:34:48 crc kubenswrapper[4749]: I0219 18:34:48.360927 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d49ll"] Feb 19 18:34:48 crc kubenswrapper[4749]: I0219 18:34:48.361805 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" 
containerName="controller-manager" containerID="cri-o://f5d9981eb5f4573cb0abba742594063e8269b0bca49482f1a49b479e43c06a9e" gracePeriod=30 Feb 19 18:34:48 crc kubenswrapper[4749]: I0219 18:34:48.458545 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"] Feb 19 18:34:48 crc kubenswrapper[4749]: I0219 18:34:48.458784 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" containerID="cri-o://3fe30a1ac9c81d9e353c6e3b53345e544f400fb05c6e4578342a26b0af123e2d" gracePeriod=30 Feb 19 18:34:49 crc kubenswrapper[4749]: I0219 18:34:49.195780 4749 generic.go:334] "Generic (PLEG): container finished" podID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerID="f5d9981eb5f4573cb0abba742594063e8269b0bca49482f1a49b479e43c06a9e" exitCode=0 Feb 19 18:34:49 crc kubenswrapper[4749]: I0219 18:34:49.195833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" event={"ID":"a1cfc2e6-b878-4af9-969b-33a513042b75","Type":"ContainerDied","Data":"f5d9981eb5f4573cb0abba742594063e8269b0bca49482f1a49b479e43c06a9e"} Feb 19 18:34:49 crc kubenswrapper[4749]: I0219 18:34:49.197708 4749 generic.go:334] "Generic (PLEG): container finished" podID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerID="3fe30a1ac9c81d9e353c6e3b53345e544f400fb05c6e4578342a26b0af123e2d" exitCode=0 Feb 19 18:34:49 crc kubenswrapper[4749]: I0219 18:34:49.197754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" event={"ID":"b1a11162-6554-4080-9b1f-e0864a79ec01","Type":"ContainerDied","Data":"3fe30a1ac9c81d9e353c6e3b53345e544f400fb05c6e4578342a26b0af123e2d"} Feb 19 18:34:50 crc kubenswrapper[4749]: I0219 
18:34:50.249896 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:34:51 crc kubenswrapper[4749]: I0219 18:34:51.550015 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d49ll container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 19 18:34:51 crc kubenswrapper[4749]: I0219 18:34:51.550330 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 19 18:34:51 crc kubenswrapper[4749]: I0219 18:34:51.693013 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 18:34:51 crc kubenswrapper[4749]: I0219 18:34:51.695049 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7vbmz" Feb 19 18:34:51 crc kubenswrapper[4749]: I0219 18:34:51.710177 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.710161239 podStartE2EDuration="710.161239ms" podCreationTimestamp="2026-02-19 18:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:34:51.709170866 +0000 UTC m=+65.670390830" watchObservedRunningTime="2026-02-19 18:34:51.710161239 +0000 UTC m=+65.671381193" Feb 19 18:34:51 crc kubenswrapper[4749]: E0219 18:34:51.993054 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:51 crc kubenswrapper[4749]: E0219 18:34:51.994348 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:51 crc kubenswrapper[4749]: E0219 18:34:51.995738 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:34:51 crc kubenswrapper[4749]: E0219 18:34:51.995767 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" Feb 19 18:34:52 crc kubenswrapper[4749]: I0219 18:34:52.700844 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-87stm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 18:34:52 crc kubenswrapper[4749]: I0219 18:34:52.701177 4749 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 18:34:57 crc kubenswrapper[4749]: I0219 18:34:57.303906 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-8tc74_4f215af7-dad4-4dd1-9cc7-20c611eacace/kube-multus-additional-cni-plugins/0.log" Feb 19 18:34:57 crc kubenswrapper[4749]: I0219 18:34:57.304303 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" exitCode=137 Feb 19 18:34:57 crc kubenswrapper[4749]: I0219 18:34:57.304363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" event={"ID":"4f215af7-dad4-4dd1-9cc7-20c611eacace","Type":"ContainerDied","Data":"dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b"} Feb 19 18:35:01 crc kubenswrapper[4749]: I0219 18:35:01.550412 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d49ll container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 19 18:35:01 crc kubenswrapper[4749]: I0219 18:35:01.550714 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 19 18:35:01 crc kubenswrapper[4749]: I0219 18:35:01.966633 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7pgnn" Feb 19 18:35:01 crc kubenswrapper[4749]: E0219 18:35:01.991487 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b is running failed: container process not found" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:35:01 crc kubenswrapper[4749]: E0219 18:35:01.991855 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b is running failed: container process not found" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:35:01 crc kubenswrapper[4749]: E0219 18:35:01.992043 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b is running failed: container process not found" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 18:35:01 crc kubenswrapper[4749]: E0219 18:35:01.992068 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" Feb 19 18:35:02 crc 
kubenswrapper[4749]: I0219 18:35:02.701173 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-87stm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 18:35:02 crc kubenswrapper[4749]: I0219 18:35:02.701231 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 18:35:02 crc kubenswrapper[4749]: I0219 18:35:02.895326 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.716749 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 18:35:06 crc kubenswrapper[4749]: E0219 18:35:06.717293 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d859ff-962c-4177-8315-c296da5a53d7" containerName="pruner" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.717314 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d859ff-962c-4177-8315-c296da5a53d7" containerName="pruner" Feb 19 18:35:06 crc kubenswrapper[4749]: E0219 18:35:06.717324 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a874c1-cd99-4392-bbb3-bdbe9d81b8cc" containerName="pruner" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.717329 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a874c1-cd99-4392-bbb3-bdbe9d81b8cc" 
containerName="pruner" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.717417 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d859ff-962c-4177-8315-c296da5a53d7" containerName="pruner" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.717426 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a874c1-cd99-4392-bbb3-bdbe9d81b8cc" containerName="pruner" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.717740 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.721315 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.722129 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.723952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.856087 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.856441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:06 crc kubenswrapper[4749]: 
I0219 18:35:06.958156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.958262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.958391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:06 crc kubenswrapper[4749]: I0219 18:35:06.977504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:07 crc kubenswrapper[4749]: I0219 18:35:07.045801 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.538450 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.538798 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7dx7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5psfv_openshift-marketplace(f9165e83-4c09-4c44-b185-8f8922fcdad7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.540137 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5psfv" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.637200 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.637553 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmzbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rmg2b_openshift-marketplace(e3563ed4-2c84-4219-9843-c95a8bba26ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.638763 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rmg2b" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" Feb 19 18:35:07 crc 
kubenswrapper[4749]: E0219 18:35:07.662204 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.662978 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qn4q6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-tscgs_openshift-marketplace(8955e517-c79b-4f52-9e06-a399c24532cf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 18:35:07 crc kubenswrapper[4749]: E0219 18:35:07.664279 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tscgs" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.261965 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rmg2b" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.261981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tscgs" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.261981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5psfv" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.324432 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.363541 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.363710 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gmq7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Star
tupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-2n4lk_openshift-marketplace(e19c61ad-b387-457b-814b-e382b0265880): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.365305 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2n4lk" podUID="e19c61ad-b387-457b-814b-e382b0265880" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.372133 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-8tc74_4f215af7-dad4-4dd1-9cc7-20c611eacace/kube-multus-additional-cni-plugins/0.log" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.372205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" event={"ID":"4f215af7-dad4-4dd1-9cc7-20c611eacace","Type":"ContainerDied","Data":"a2410d7ed1ca93b0dc77f073b57939615bef27b8edc58ce7964b9a5d1380e39a"} Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.372229 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2410d7ed1ca93b0dc77f073b57939615bef27b8edc58ce7964b9a5d1380e39a" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.373754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" event={"ID":"b1a11162-6554-4080-9b1f-e0864a79ec01","Type":"ContainerDied","Data":"50de07c05a930148df516e42956f444c5c12beca8e6ace09e9f1db62b7267fb3"} Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.373798 4749 scope.go:117] "RemoveContainer" 
containerID="3fe30a1ac9c81d9e353c6e3b53345e544f400fb05c6e4578342a26b0af123e2d" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.373867 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.394194 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-8tc74_4f215af7-dad4-4dd1-9cc7-20c611eacace/kube-multus-additional-cni-plugins/0.log" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.394291 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.408909 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.409083 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wtxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-pdxkf_openshift-marketplace(81536517-730c-4da7-b371-efe28f18a1f3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.410239 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-pdxkf" podUID="81536517-730c-4da7-b371-efe28f18a1f3" Feb 19 18:35:09 crc 
kubenswrapper[4749]: E0219 18:35:09.459494 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.459630 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pfdxc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-zqrzw_openshift-marketplace(ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.460795 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zqrzw" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495368 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6s22\" (UniqueName: \"kubernetes.io/projected/b1a11162-6554-4080-9b1f-e0864a79ec01-kube-api-access-z6s22\") pod \"b1a11162-6554-4080-9b1f-e0864a79ec01\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495724 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-client-ca\") pod \"b1a11162-6554-4080-9b1f-e0864a79ec01\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a11162-6554-4080-9b1f-e0864a79ec01-serving-cert\") pod \"b1a11162-6554-4080-9b1f-e0864a79ec01\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495777 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4f215af7-dad4-4dd1-9cc7-20c611eacace-ready\") pod \"4f215af7-dad4-4dd1-9cc7-20c611eacace\" (UID: 
\"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-config\") pod \"b1a11162-6554-4080-9b1f-e0864a79ec01\" (UID: \"b1a11162-6554-4080-9b1f-e0864a79ec01\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f215af7-dad4-4dd1-9cc7-20c611eacace-cni-sysctl-allowlist\") pod \"4f215af7-dad4-4dd1-9cc7-20c611eacace\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f215af7-dad4-4dd1-9cc7-20c611eacace-tuning-conf-dir\") pod \"4f215af7-dad4-4dd1-9cc7-20c611eacace\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.495887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6djc\" (UniqueName: \"kubernetes.io/projected/4f215af7-dad4-4dd1-9cc7-20c611eacace-kube-api-access-f6djc\") pod \"4f215af7-dad4-4dd1-9cc7-20c611eacace\" (UID: \"4f215af7-dad4-4dd1-9cc7-20c611eacace\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.497207 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-client-ca" (OuterVolumeSpecName: "client-ca") pod "b1a11162-6554-4080-9b1f-e0864a79ec01" (UID: "b1a11162-6554-4080-9b1f-e0864a79ec01"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.499107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f215af7-dad4-4dd1-9cc7-20c611eacace-ready" (OuterVolumeSpecName: "ready") pod "4f215af7-dad4-4dd1-9cc7-20c611eacace" (UID: "4f215af7-dad4-4dd1-9cc7-20c611eacace"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.499193 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f215af7-dad4-4dd1-9cc7-20c611eacace-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "4f215af7-dad4-4dd1-9cc7-20c611eacace" (UID: "4f215af7-dad4-4dd1-9cc7-20c611eacace"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.499248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f215af7-dad4-4dd1-9cc7-20c611eacace-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "4f215af7-dad4-4dd1-9cc7-20c611eacace" (UID: "4f215af7-dad4-4dd1-9cc7-20c611eacace"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.499648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-config" (OuterVolumeSpecName: "config") pod "b1a11162-6554-4080-9b1f-e0864a79ec01" (UID: "b1a11162-6554-4080-9b1f-e0864a79ec01"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.502109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a11162-6554-4080-9b1f-e0864a79ec01-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b1a11162-6554-4080-9b1f-e0864a79ec01" (UID: "b1a11162-6554-4080-9b1f-e0864a79ec01"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.504165 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f215af7-dad4-4dd1-9cc7-20c611eacace-kube-api-access-f6djc" (OuterVolumeSpecName: "kube-api-access-f6djc") pod "4f215af7-dad4-4dd1-9cc7-20c611eacace" (UID: "4f215af7-dad4-4dd1-9cc7-20c611eacace"). InnerVolumeSpecName "kube-api-access-f6djc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.505552 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a11162-6554-4080-9b1f-e0864a79ec01-kube-api-access-z6s22" (OuterVolumeSpecName: "kube-api-access-z6s22") pod "b1a11162-6554-4080-9b1f-e0864a79ec01" (UID: "b1a11162-6554-4080-9b1f-e0864a79ec01"). InnerVolumeSpecName "kube-api-access-z6s22". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.540149 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.596762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdhhw\" (UniqueName: \"kubernetes.io/projected/a1cfc2e6-b878-4af9-969b-33a513042b75-kube-api-access-fdhhw\") pod \"a1cfc2e6-b878-4af9-969b-33a513042b75\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.596815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-config\") pod \"a1cfc2e6-b878-4af9-969b-33a513042b75\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.596848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1cfc2e6-b878-4af9-969b-33a513042b75-serving-cert\") pod \"a1cfc2e6-b878-4af9-969b-33a513042b75\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.596867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-proxy-ca-bundles\") pod \"a1cfc2e6-b878-4af9-969b-33a513042b75\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.596891 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-client-ca\") pod \"a1cfc2e6-b878-4af9-969b-33a513042b75\" (UID: \"a1cfc2e6-b878-4af9-969b-33a513042b75\") " Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597053 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597064 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1a11162-6554-4080-9b1f-e0864a79ec01-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597072 4749 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4f215af7-dad4-4dd1-9cc7-20c611eacace-ready\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597081 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a11162-6554-4080-9b1f-e0864a79ec01-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597089 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f215af7-dad4-4dd1-9cc7-20c611eacace-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597097 4749 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f215af7-dad4-4dd1-9cc7-20c611eacace-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597106 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6djc\" (UniqueName: \"kubernetes.io/projected/4f215af7-dad4-4dd1-9cc7-20c611eacace-kube-api-access-f6djc\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597115 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6s22\" (UniqueName: \"kubernetes.io/projected/b1a11162-6554-4080-9b1f-e0864a79ec01-kube-api-access-z6s22\") on node \"crc\" DevicePath \"\"" Feb 19 
18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a1cfc2e6-b878-4af9-969b-33a513042b75" (UID: "a1cfc2e6-b878-4af9-969b-33a513042b75"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.597807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-client-ca" (OuterVolumeSpecName: "client-ca") pod "a1cfc2e6-b878-4af9-969b-33a513042b75" (UID: "a1cfc2e6-b878-4af9-969b-33a513042b75"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.598048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-config" (OuterVolumeSpecName: "config") pod "a1cfc2e6-b878-4af9-969b-33a513042b75" (UID: "a1cfc2e6-b878-4af9-969b-33a513042b75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.601045 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cfc2e6-b878-4af9-969b-33a513042b75-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a1cfc2e6-b878-4af9-969b-33a513042b75" (UID: "a1cfc2e6-b878-4af9-969b-33a513042b75"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.601538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1cfc2e6-b878-4af9-969b-33a513042b75-kube-api-access-fdhhw" (OuterVolumeSpecName: "kube-api-access-fdhhw") pod "a1cfc2e6-b878-4af9-969b-33a513042b75" (UID: "a1cfc2e6-b878-4af9-969b-33a513042b75"). InnerVolumeSpecName "kube-api-access-fdhhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.697904 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdhhw\" (UniqueName: \"kubernetes.io/projected/a1cfc2e6-b878-4af9-969b-33a513042b75-kube-api-access-fdhhw\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.697937 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.697954 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1cfc2e6-b878-4af9-969b-33a513042b75-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.697980 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.698017 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1cfc2e6-b878-4af9-969b-33a513042b75-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.698746 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 19 
18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.703746 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"] Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.706224 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87stm"] Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.711862 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c95786564-fnpmk"] Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.712137 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.712157 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.712168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.712178 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" Feb 19 18:35:09 crc kubenswrapper[4749]: E0219 18:35:09.712189 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.712196 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.712310 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" containerName="route-controller-manager" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.712326 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" containerName="controller-manager" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.712340 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" containerName="kube-multus-additional-cni-plugins" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.712786 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.720487 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"] Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.721156 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.724416 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.724871 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.725142 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.725325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.725423 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.725490 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.726604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c95786564-fnpmk"] Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.733166 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"] Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.776561 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.776540888 podStartE2EDuration="776.540888ms" podCreationTimestamp="2026-02-19 18:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:09.754500666 +0000 UTC m=+83.715720620" watchObservedRunningTime="2026-02-19 18:35:09.776540888 +0000 UTC m=+83.737760852" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.779532 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 18:35:09 crc kubenswrapper[4749]: W0219 18:35:09.785777 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc7152bd9_31db_4fb0_878d_4a4a46a3619d.slice/crio-4940459b31e0402f4290ca719dbd47fb7b9ee390af2cd344201be846a4cd2328 WatchSource:0}: Error finding container 4940459b31e0402f4290ca719dbd47fb7b9ee390af2cd344201be846a4cd2328: Status 404 returned error can't find the container with id 4940459b31e0402f4290ca719dbd47fb7b9ee390af2cd344201be846a4cd2328 Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-config\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phs5q\" (UniqueName: \"kubernetes.io/projected/cce85d47-c65c-4051-afad-8b9667558414-kube-api-access-phs5q\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cce85d47-c65c-4051-afad-8b9667558414-serving-cert\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801203 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a367aab-4b47-43f2-b158-c0dc8cf9d797-serving-cert\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-client-ca\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-client-ca\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pdrb\" (UniqueName: \"kubernetes.io/projected/7a367aab-4b47-43f2-b158-c0dc8cf9d797-kube-api-access-9pdrb\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " 
pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-proxy-ca-bundles\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.801303 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-config\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.902376 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-client-ca\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.902649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pdrb\" (UniqueName: \"kubernetes.io/projected/7a367aab-4b47-43f2-b158-c0dc8cf9d797-kube-api-access-9pdrb\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.902669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-client-ca\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.902713 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-proxy-ca-bundles\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.902839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-config\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.902916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-config\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.902937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phs5q\" (UniqueName: \"kubernetes.io/projected/cce85d47-c65c-4051-afad-8b9667558414-kube-api-access-phs5q\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 
18:35:09.903751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce85d47-c65c-4051-afad-8b9667558414-serving-cert\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.903826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a367aab-4b47-43f2-b158-c0dc8cf9d797-serving-cert\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.905366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-client-ca\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.906554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-config\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.906717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-config\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " 
pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.906774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-client-ca\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.908743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-proxy-ca-bundles\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.911546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce85d47-c65c-4051-afad-8b9667558414-serving-cert\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.914793 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a367aab-4b47-43f2-b158-c0dc8cf9d797-serving-cert\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.921051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pdrb\" (UniqueName: 
\"kubernetes.io/projected/7a367aab-4b47-43f2-b158-c0dc8cf9d797-kube-api-access-9pdrb\") pod \"route-controller-manager-7548bb6cfb-zbzlg\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") " pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:09 crc kubenswrapper[4749]: I0219 18:35:09.926493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phs5q\" (UniqueName: \"kubernetes.io/projected/cce85d47-c65c-4051-afad-8b9667558414-kube-api-access-phs5q\") pod \"controller-manager-5c95786564-fnpmk\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") " pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.028937 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.089626 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.218319 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c95786564-fnpmk"] Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.293618 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"] Feb 19 18:35:10 crc kubenswrapper[4749]: W0219 18:35:10.302940 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a367aab_4b47_43f2_b158_c0dc8cf9d797.slice/crio-4b3f151cee3bbb3b524875d7d65065ea2246aedb2ce83077a285b14ed5ed0eba WatchSource:0}: Error finding container 4b3f151cee3bbb3b524875d7d65065ea2246aedb2ce83077a285b14ed5ed0eba: Status 404 returned error can't find the container with id 4b3f151cee3bbb3b524875d7d65065ea2246aedb2ce83077a285b14ed5ed0eba Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.381006 4749 generic.go:334] "Generic (PLEG): container finished" podID="69277352-22e8-4094-944f-bb38a3fb3a83" containerID="306f0f4cfccfc898010693b446028e5667f6ec4dba2ed545165d44aff25071e8" exitCode=0 Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.381095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mw89w" event={"ID":"69277352-22e8-4094-944f-bb38a3fb3a83","Type":"ContainerDied","Data":"306f0f4cfccfc898010693b446028e5667f6ec4dba2ed545165d44aff25071e8"} Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.383428 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.383435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d49ll" event={"ID":"a1cfc2e6-b878-4af9-969b-33a513042b75","Type":"ContainerDied","Data":"dbd546725d915e3e7eaca96ba66453ecddbd8264ca281f22e92bb45de474be15"} Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.383494 4749 scope.go:117] "RemoveContainer" containerID="f5d9981eb5f4573cb0abba742594063e8269b0bca49482f1a49b479e43c06a9e" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.386048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" event={"ID":"cce85d47-c65c-4051-afad-8b9667558414","Type":"ContainerStarted","Data":"ad2438e3f94a9cb6ca352cdc6a2c65331ebf883fa2044d4fa9cf3f971679365c"} Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.387078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" event={"ID":"7a367aab-4b47-43f2-b158-c0dc8cf9d797","Type":"ContainerStarted","Data":"4b3f151cee3bbb3b524875d7d65065ea2246aedb2ce83077a285b14ed5ed0eba"} Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.388886 4749 generic.go:334] "Generic (PLEG): container finished" podID="856ea752-2729-4936-96aa-423c76975a34" containerID="bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7" exitCode=0 Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.388981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5cv" event={"ID":"856ea752-2729-4936-96aa-423c76975a34","Type":"ContainerDied","Data":"bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7"} Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.391213 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7152bd9-31db-4fb0-878d-4a4a46a3619d","Type":"ContainerStarted","Data":"eb00765bd0af3540240ee9527d8cee2d3a99fc7997d989a4097b45f6bbf7ccd3"} Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.391243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7152bd9-31db-4fb0-878d-4a4a46a3619d","Type":"ContainerStarted","Data":"4940459b31e0402f4290ca719dbd47fb7b9ee390af2cd344201be846a4cd2328"} Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.391573 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8tc74" Feb 19 18:35:10 crc kubenswrapper[4749]: E0219 18:35:10.393218 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-pdxkf" podUID="81536517-730c-4da7-b371-efe28f18a1f3" Feb 19 18:35:10 crc kubenswrapper[4749]: E0219 18:35:10.393419 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2n4lk" podUID="e19c61ad-b387-457b-814b-e382b0265880" Feb 19 18:35:10 crc kubenswrapper[4749]: E0219 18:35:10.397806 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zqrzw" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.455505 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.455013882 podStartE2EDuration="4.455013882s" podCreationTimestamp="2026-02-19 18:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:10.448884533 +0000 UTC m=+84.410104497" watchObservedRunningTime="2026-02-19 18:35:10.455013882 +0000 UTC m=+84.416233856" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.490298 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8tc74"] Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.496359 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8tc74"] Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.505185 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d49ll"] Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.509193 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d49ll"] Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.686117 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f215af7-dad4-4dd1-9cc7-20c611eacace" path="/var/lib/kubelet/pods/4f215af7-dad4-4dd1-9cc7-20c611eacace/volumes" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.687159 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1cfc2e6-b878-4af9-969b-33a513042b75" path="/var/lib/kubelet/pods/a1cfc2e6-b878-4af9-969b-33a513042b75/volumes" Feb 19 18:35:10 crc kubenswrapper[4749]: I0219 18:35:10.687653 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a11162-6554-4080-9b1f-e0864a79ec01" path="/var/lib/kubelet/pods/b1a11162-6554-4080-9b1f-e0864a79ec01/volumes" Feb 19 18:35:11 crc 
kubenswrapper[4749]: I0219 18:35:11.397218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" event={"ID":"7a367aab-4b47-43f2-b158-c0dc8cf9d797","Type":"ContainerStarted","Data":"f31058129b2afcd5320b0a118ae2901602977fb32040db991669deb419f24057"} Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.398743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.401711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5cv" event={"ID":"856ea752-2729-4936-96aa-423c76975a34","Type":"ContainerStarted","Data":"3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379"} Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.403843 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7152bd9-31db-4fb0-878d-4a4a46a3619d" containerID="eb00765bd0af3540240ee9527d8cee2d3a99fc7997d989a4097b45f6bbf7ccd3" exitCode=0 Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.403909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7152bd9-31db-4fb0-878d-4a4a46a3619d","Type":"ContainerDied","Data":"eb00765bd0af3540240ee9527d8cee2d3a99fc7997d989a4097b45f6bbf7ccd3"} Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.403938 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.405883 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mw89w" event={"ID":"69277352-22e8-4094-944f-bb38a3fb3a83","Type":"ContainerStarted","Data":"c6924bb6bd831cc3ea14dd7bfc20730677ce22775f25736361ed2ef2eba435f4"} Feb 19 
18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.409053 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" event={"ID":"cce85d47-c65c-4051-afad-8b9667558414","Type":"ContainerStarted","Data":"d898ab105bd1f03bf1d63221ba2ce75ea43c99e73897e9522a451b2fb1864a20"} Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.409308 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.414795 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.418980 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" podStartSLOduration=3.418957488 podStartE2EDuration="3.418957488s" podCreationTimestamp="2026-02-19 18:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:11.415965 +0000 UTC m=+85.377184974" watchObservedRunningTime="2026-02-19 18:35:11.418957488 +0000 UTC m=+85.380177442" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.448004 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" podStartSLOduration=3.447982938 podStartE2EDuration="3.447982938s" podCreationTimestamp="2026-02-19 18:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:11.438800399 +0000 UTC m=+85.400020363" watchObservedRunningTime="2026-02-19 18:35:11.447982938 +0000 UTC m=+85.409202902" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 
18:35:11.467109 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tb5cv" podStartSLOduration=2.559632654 podStartE2EDuration="41.467048521s" podCreationTimestamp="2026-02-19 18:34:30 +0000 UTC" firstStartedPulling="2026-02-19 18:34:31.960230957 +0000 UTC m=+45.921450911" lastFinishedPulling="2026-02-19 18:35:10.867646824 +0000 UTC m=+84.828866778" observedRunningTime="2026-02-19 18:35:11.461129757 +0000 UTC m=+85.422349721" watchObservedRunningTime="2026-02-19 18:35:11.467048521 +0000 UTC m=+85.428268475" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.504092 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mw89w" podStartSLOduration=2.604638232 podStartE2EDuration="40.504076783s" podCreationTimestamp="2026-02-19 18:34:31 +0000 UTC" firstStartedPulling="2026-02-19 18:34:33.039520114 +0000 UTC m=+47.000740068" lastFinishedPulling="2026-02-19 18:35:10.938958665 +0000 UTC m=+84.900178619" observedRunningTime="2026-02-19 18:35:11.49996118 +0000 UTC m=+85.461181144" watchObservedRunningTime="2026-02-19 18:35:11.504076783 +0000 UTC m=+85.465296737" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.773127 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:35:11 crc kubenswrapper[4749]: I0219 18:35:11.773470 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.648068 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.742389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kubelet-dir\") pod \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.742440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kube-api-access\") pod \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\" (UID: \"c7152bd9-31db-4fb0-878d-4a4a46a3619d\") " Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.742550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c7152bd9-31db-4fb0-878d-4a4a46a3619d" (UID: "c7152bd9-31db-4fb0-878d-4a4a46a3619d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.743139 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.749238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c7152bd9-31db-4fb0-878d-4a4a46a3619d" (UID: "c7152bd9-31db-4fb0-878d-4a4a46a3619d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.844713 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7152bd9-31db-4fb0-878d-4a4a46a3619d-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:12 crc kubenswrapper[4749]: I0219 18:35:12.971786 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mw89w" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="registry-server" probeResult="failure" output=< Feb 19 18:35:12 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 18:35:12 crc kubenswrapper[4749]: > Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.423262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7152bd9-31db-4fb0-878d-4a4a46a3619d","Type":"ContainerDied","Data":"4940459b31e0402f4290ca719dbd47fb7b9ee390af2cd344201be846a4cd2328"} Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.423356 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4940459b31e0402f4290ca719dbd47fb7b9ee390af2cd344201be846a4cd2328" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.423542 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.514265 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 18:35:13 crc kubenswrapper[4749]: E0219 18:35:13.514556 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7152bd9-31db-4fb0-878d-4a4a46a3619d" containerName="pruner" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.514580 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7152bd9-31db-4fb0-878d-4a4a46a3619d" containerName="pruner" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.514753 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7152bd9-31db-4fb0-878d-4a4a46a3619d" containerName="pruner" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.515367 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.519403 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.519966 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.556234 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.556428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-var-lock\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.556476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e6539f5-4700-4a9d-9428-6752835bbe20-kube-api-access\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.557839 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.657976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-var-lock\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.658025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e6539f5-4700-4a9d-9428-6752835bbe20-kube-api-access\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.658097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.658110 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-var-lock\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.658186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.678064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e6539f5-4700-4a9d-9428-6752835bbe20-kube-api-access\") pod \"installer-9-crc\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:13 crc kubenswrapper[4749]: I0219 18:35:13.831967 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:14 crc kubenswrapper[4749]: I0219 18:35:14.295624 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 18:35:14 crc kubenswrapper[4749]: W0219 18:35:14.314794 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5e6539f5_4700_4a9d_9428_6752835bbe20.slice/crio-61d47eb17d54c9f17e2ac716e9908a391157247e33fe3543cc757081b46db643 WatchSource:0}: Error finding container 61d47eb17d54c9f17e2ac716e9908a391157247e33fe3543cc757081b46db643: Status 404 returned error can't find the container with id 61d47eb17d54c9f17e2ac716e9908a391157247e33fe3543cc757081b46db643 Feb 19 18:35:14 crc kubenswrapper[4749]: I0219 18:35:14.430526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5e6539f5-4700-4a9d-9428-6752835bbe20","Type":"ContainerStarted","Data":"61d47eb17d54c9f17e2ac716e9908a391157247e33fe3543cc757081b46db643"} Feb 19 18:35:15 crc kubenswrapper[4749]: I0219 18:35:15.435578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5e6539f5-4700-4a9d-9428-6752835bbe20","Type":"ContainerStarted","Data":"4656ad09f93735fc310b822ee2c412ec5fc90c47600db1cd180c08d0b5286139"} Feb 19 18:35:15 crc kubenswrapper[4749]: I0219 18:35:15.450028 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.450004031 podStartE2EDuration="2.450004031s" podCreationTimestamp="2026-02-19 18:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:15.449353276 +0000 UTC m=+89.410573230" watchObservedRunningTime="2026-02-19 18:35:15.450004031 +0000 UTC m=+89.411223995" Feb 19 18:35:20 crc kubenswrapper[4749]: I0219 
18:35:20.828104 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:35:20 crc kubenswrapper[4749]: I0219 18:35:20.828783 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:35:20 crc kubenswrapper[4749]: I0219 18:35:20.872637 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:35:21 crc kubenswrapper[4749]: I0219 18:35:21.498717 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:35:21 crc kubenswrapper[4749]: I0219 18:35:21.851086 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:35:21 crc kubenswrapper[4749]: I0219 18:35:21.892210 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:35:22 crc kubenswrapper[4749]: I0219 18:35:22.112382 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5cv"] Feb 19 18:35:23 crc kubenswrapper[4749]: I0219 18:35:23.471369 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tb5cv" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="registry-server" containerID="cri-o://3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379" gracePeriod=2 Feb 19 18:35:23 crc kubenswrapper[4749]: I0219 18:35:23.964776 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:35:23 crc kubenswrapper[4749]: I0219 18:35:23.998584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-utilities\") pod \"856ea752-2729-4936-96aa-423c76975a34\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " Feb 19 18:35:23 crc kubenswrapper[4749]: I0219 18:35:23.998704 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-catalog-content\") pod \"856ea752-2729-4936-96aa-423c76975a34\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " Feb 19 18:35:23 crc kubenswrapper[4749]: I0219 18:35:23.998782 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq7ss\" (UniqueName: \"kubernetes.io/projected/856ea752-2729-4936-96aa-423c76975a34-kube-api-access-pq7ss\") pod \"856ea752-2729-4936-96aa-423c76975a34\" (UID: \"856ea752-2729-4936-96aa-423c76975a34\") " Feb 19 18:35:23 crc kubenswrapper[4749]: I0219 18:35:23.999318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-utilities" (OuterVolumeSpecName: "utilities") pod "856ea752-2729-4936-96aa-423c76975a34" (UID: "856ea752-2729-4936-96aa-423c76975a34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.004931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856ea752-2729-4936-96aa-423c76975a34-kube-api-access-pq7ss" (OuterVolumeSpecName: "kube-api-access-pq7ss") pod "856ea752-2729-4936-96aa-423c76975a34" (UID: "856ea752-2729-4936-96aa-423c76975a34"). InnerVolumeSpecName "kube-api-access-pq7ss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.024826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "856ea752-2729-4936-96aa-423c76975a34" (UID: "856ea752-2729-4936-96aa-423c76975a34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.099762 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.099794 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856ea752-2729-4936-96aa-423c76975a34-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.099808 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq7ss\" (UniqueName: \"kubernetes.io/projected/856ea752-2729-4936-96aa-423c76975a34-kube-api-access-pq7ss\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.477791 4749 generic.go:334] "Generic (PLEG): container finished" podID="856ea752-2729-4936-96aa-423c76975a34" containerID="3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379" exitCode=0 Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.477840 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5cv" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.477868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5cv" event={"ID":"856ea752-2729-4936-96aa-423c76975a34","Type":"ContainerDied","Data":"3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379"} Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.478258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5cv" event={"ID":"856ea752-2729-4936-96aa-423c76975a34","Type":"ContainerDied","Data":"9ed5a059c4db7002016b2370e9270cac7436ce74b6f6c6ce8ed82e84e2ed6e4e"} Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.478277 4749 scope.go:117] "RemoveContainer" containerID="3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.479903 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerID="a1c99cc01e261fa29c40c85fa0b7f1d6a0c233f77180bb31566d2edadb41d805" exitCode=0 Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.479959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5psfv" event={"ID":"f9165e83-4c09-4c44-b185-8f8922fcdad7","Type":"ContainerDied","Data":"a1c99cc01e261fa29c40c85fa0b7f1d6a0c233f77180bb31566d2edadb41d805"} Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.481644 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerID="e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069" exitCode=0 Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.481702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmg2b" 
event={"ID":"e3563ed4-2c84-4219-9843-c95a8bba26ac","Type":"ContainerDied","Data":"e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069"} Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.483599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tscgs" event={"ID":"8955e517-c79b-4f52-9e06-a399c24532cf","Type":"ContainerStarted","Data":"17e3d31959da0414bef01bac0ee8edbbfa1f9e079591bcf66422ec767644d585"} Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.486314 4749 generic.go:334] "Generic (PLEG): container finished" podID="81536517-730c-4da7-b371-efe28f18a1f3" containerID="ff83d5885b32b3e1d0b0a8ec1a6d13cb1cbcdfba0e2e1e19bc101e6b6b5eadb7" exitCode=0 Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.486344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdxkf" event={"ID":"81536517-730c-4da7-b371-efe28f18a1f3","Type":"ContainerDied","Data":"ff83d5885b32b3e1d0b0a8ec1a6d13cb1cbcdfba0e2e1e19bc101e6b6b5eadb7"} Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.504259 4749 scope.go:117] "RemoveContainer" containerID="bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.520613 4749 scope.go:117] "RemoveContainer" containerID="6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.537170 4749 scope.go:117] "RemoveContainer" containerID="3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379" Feb 19 18:35:24 crc kubenswrapper[4749]: E0219 18:35:24.538940 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379\": container with ID starting with 3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379 not found: ID does not exist" 
containerID="3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.538988 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379"} err="failed to get container status \"3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379\": rpc error: code = NotFound desc = could not find container \"3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379\": container with ID starting with 3913492dbe1ea845ea6516bcf80e77f399733013c65daee92b46ec493aaa8379 not found: ID does not exist" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.539060 4749 scope.go:117] "RemoveContainer" containerID="bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7" Feb 19 18:35:24 crc kubenswrapper[4749]: E0219 18:35:24.539405 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7\": container with ID starting with bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7 not found: ID does not exist" containerID="bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.539432 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7"} err="failed to get container status \"bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7\": rpc error: code = NotFound desc = could not find container \"bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7\": container with ID starting with bcbdcc71bf02babd5c0ee38cfe6585b024f0ce3b6f6ce35a66bdc41c9a6895d7 not found: ID does not exist" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.539451 4749 scope.go:117] 
"RemoveContainer" containerID="6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1" Feb 19 18:35:24 crc kubenswrapper[4749]: E0219 18:35:24.539681 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1\": container with ID starting with 6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1 not found: ID does not exist" containerID="6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.539705 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1"} err="failed to get container status \"6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1\": rpc error: code = NotFound desc = could not find container \"6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1\": container with ID starting with 6362d0d09159f8ba698ea5c8d41fe299a2d0e9c032024692a20595568f08f2b1 not found: ID does not exist" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.570554 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5cv"] Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.575123 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5cv"] Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.689305 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856ea752-2729-4936-96aa-423c76975a34" path="/var/lib/kubelet/pods/856ea752-2729-4936-96aa-423c76975a34/volumes" Feb 19 18:35:24 crc kubenswrapper[4749]: I0219 18:35:24.843116 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2dbbd"] Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.494000 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmg2b" event={"ID":"e3563ed4-2c84-4219-9843-c95a8bba26ac","Type":"ContainerStarted","Data":"aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179"} Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.497506 4749 generic.go:334] "Generic (PLEG): container finished" podID="8955e517-c79b-4f52-9e06-a399c24532cf" containerID="17e3d31959da0414bef01bac0ee8edbbfa1f9e079591bcf66422ec767644d585" exitCode=0 Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.497559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tscgs" event={"ID":"8955e517-c79b-4f52-9e06-a399c24532cf","Type":"ContainerDied","Data":"17e3d31959da0414bef01bac0ee8edbbfa1f9e079591bcf66422ec767644d585"} Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.500273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdxkf" event={"ID":"81536517-730c-4da7-b371-efe28f18a1f3","Type":"ContainerStarted","Data":"c74d5aabbdd09d1bd9a8a16830e0bad0afe21534a920f308bc1adbb5aa91ef46"} Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.504587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5psfv" event={"ID":"f9165e83-4c09-4c44-b185-8f8922fcdad7","Type":"ContainerStarted","Data":"79c420ff0c280697eff14168d9b1bd532f25df855e04ae0afe8f73f071036838"} Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.506118 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerID="1627e72b097131221e09b8f0a5379b7bd8d53f17bf4a97521131a0f62bd155f8" exitCode=0 Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.506158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqrzw" 
event={"ID":"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2","Type":"ContainerDied","Data":"1627e72b097131221e09b8f0a5379b7bd8d53f17bf4a97521131a0f62bd155f8"}
Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.509016 4749 generic.go:334] "Generic (PLEG): container finished" podID="e19c61ad-b387-457b-814b-e382b0265880" containerID="021ab6d95986618edbe6ae1d8d8362f6b09613200000a3026e94528ab4e46f3d" exitCode=0
Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.509052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n4lk" event={"ID":"e19c61ad-b387-457b-814b-e382b0265880","Type":"ContainerDied","Data":"021ab6d95986618edbe6ae1d8d8362f6b09613200000a3026e94528ab4e46f3d"}
Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.516418 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rmg2b" podStartSLOduration=2.256617006 podStartE2EDuration="57.516402971s" podCreationTimestamp="2026-02-19 18:34:28 +0000 UTC" firstStartedPulling="2026-02-19 18:34:29.842703325 +0000 UTC m=+43.803923279" lastFinishedPulling="2026-02-19 18:35:25.10248929 +0000 UTC m=+99.063709244" observedRunningTime="2026-02-19 18:35:25.514584089 +0000 UTC m=+99.475804033" watchObservedRunningTime="2026-02-19 18:35:25.516402971 +0000 UTC m=+99.477622925"
Feb 19 18:35:25 crc kubenswrapper[4749]: I0219 18:35:25.586989 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5psfv" podStartSLOduration=2.515132962 podStartE2EDuration="57.586970205s" podCreationTimestamp="2026-02-19 18:34:28 +0000 UTC" firstStartedPulling="2026-02-19 18:34:29.861016321 +0000 UTC m=+43.822236275" lastFinishedPulling="2026-02-19 18:35:24.932853564 +0000 UTC m=+98.894073518" observedRunningTime="2026-02-19 18:35:25.585044332 +0000 UTC m=+99.546264286" watchObservedRunningTime="2026-02-19 18:35:25.586970205 +0000 UTC m=+99.548190159"
Feb 19 18:35:26 crc kubenswrapper[4749]: I0219 18:35:26.515149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tscgs" event={"ID":"8955e517-c79b-4f52-9e06-a399c24532cf","Type":"ContainerStarted","Data":"3f2717897c1aa54349ba14cab1558a202e4d64d7460d570a079357a93a17f72d"}
Feb 19 18:35:26 crc kubenswrapper[4749]: I0219 18:35:26.517877 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqrzw" event={"ID":"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2","Type":"ContainerStarted","Data":"57053058ce9dba1ad132b46be3808a9c08bcce8abeab544fe1107d1a539677c0"}
Feb 19 18:35:26 crc kubenswrapper[4749]: I0219 18:35:26.519997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n4lk" event={"ID":"e19c61ad-b387-457b-814b-e382b0265880","Type":"ContainerStarted","Data":"bf71ebb1f626a916784e5d96dac0e63da11729d1b2eab51122f2a1fcf658658c"}
Feb 19 18:35:26 crc kubenswrapper[4749]: I0219 18:35:26.535820 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pdxkf" podStartSLOduration=3.516005627 podStartE2EDuration="58.535803927s" podCreationTimestamp="2026-02-19 18:34:28 +0000 UTC" firstStartedPulling="2026-02-19 18:34:29.850834659 +0000 UTC m=+43.812054613" lastFinishedPulling="2026-02-19 18:35:24.870632959 +0000 UTC m=+98.831852913" observedRunningTime="2026-02-19 18:35:25.613755604 +0000 UTC m=+99.574975558" watchObservedRunningTime="2026-02-19 18:35:26.535803927 +0000 UTC m=+100.497023881"
Feb 19 18:35:26 crc kubenswrapper[4749]: I0219 18:35:26.553593 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2n4lk" podStartSLOduration=2.359690348 podStartE2EDuration="58.55357351s" podCreationTimestamp="2026-02-19 18:34:28 +0000 UTC" firstStartedPulling="2026-02-19 18:34:29.833110857 +0000 UTC m=+43.794330811" lastFinishedPulling="2026-02-19 18:35:26.026994019 +0000 UTC m=+99.988213973" observedRunningTime="2026-02-19 18:35:26.551974664 +0000 UTC m=+100.513194618" watchObservedRunningTime="2026-02-19 18:35:26.55357351 +0000 UTC m=+100.514793464"
Feb 19 18:35:26 crc kubenswrapper[4749]: I0219 18:35:26.555927 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tscgs" podStartSLOduration=2.68591591 podStartE2EDuration="55.555915814s" podCreationTimestamp="2026-02-19 18:34:31 +0000 UTC" firstStartedPulling="2026-02-19 18:34:32.996499356 +0000 UTC m=+46.957719310" lastFinishedPulling="2026-02-19 18:35:25.86649926 +0000 UTC m=+99.827719214" observedRunningTime="2026-02-19 18:35:26.537893854 +0000 UTC m=+100.499113818" watchObservedRunningTime="2026-02-19 18:35:26.555915814 +0000 UTC m=+100.517135768"
Feb 19 18:35:26 crc kubenswrapper[4749]: I0219 18:35:26.567772 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zqrzw" podStartSLOduration=2.474855626 podStartE2EDuration="56.567755653s" podCreationTimestamp="2026-02-19 18:34:30 +0000 UTC" firstStartedPulling="2026-02-19 18:34:32.006895977 +0000 UTC m=+45.968115931" lastFinishedPulling="2026-02-19 18:35:26.099796004 +0000 UTC m=+100.061015958" observedRunningTime="2026-02-19 18:35:26.566502395 +0000 UTC m=+100.527722369" watchObservedRunningTime="2026-02-19 18:35:26.567755653 +0000 UTC m=+100.528975607"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.347073 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2n4lk"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.347406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2n4lk"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.350534 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c95786564-fnpmk"]
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.350699 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" podUID="cce85d47-c65c-4051-afad-8b9667558414" containerName="controller-manager" containerID="cri-o://d898ab105bd1f03bf1d63221ba2ce75ea43c99e73897e9522a451b2fb1864a20" gracePeriod=30
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.401884 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2n4lk"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.456267 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"]
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.456724 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" podUID="7a367aab-4b47-43f2-b158-c0dc8cf9d797" containerName="route-controller-manager" containerID="cri-o://f31058129b2afcd5320b0a118ae2901602977fb32040db991669deb419f24057" gracePeriod=30
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.531265 4749 generic.go:334] "Generic (PLEG): container finished" podID="cce85d47-c65c-4051-afad-8b9667558414" containerID="d898ab105bd1f03bf1d63221ba2ce75ea43c99e73897e9522a451b2fb1864a20" exitCode=0
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.531339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" event={"ID":"cce85d47-c65c-4051-afad-8b9667558414","Type":"ContainerDied","Data":"d898ab105bd1f03bf1d63221ba2ce75ea43c99e73897e9522a451b2fb1864a20"}
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.550506 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5psfv"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.550611 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5psfv"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.588298 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5psfv"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.764464 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pdxkf"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.764532 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pdxkf"
Feb 19 18:35:28 crc kubenswrapper[4749]: I0219 18:35:28.799374 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pdxkf"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.014503 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rmg2b"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.014799 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rmg2b"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.026655 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.057294 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rmg2b"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.076486 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phs5q\" (UniqueName: \"kubernetes.io/projected/cce85d47-c65c-4051-afad-8b9667558414-kube-api-access-phs5q\") pod \"cce85d47-c65c-4051-afad-8b9667558414\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") "
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.076571 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-config\") pod \"cce85d47-c65c-4051-afad-8b9667558414\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") "
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.076636 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-proxy-ca-bundles\") pod \"cce85d47-c65c-4051-afad-8b9667558414\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") "
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.076680 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce85d47-c65c-4051-afad-8b9667558414-serving-cert\") pod \"cce85d47-c65c-4051-afad-8b9667558414\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") "
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.076701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-client-ca\") pod \"cce85d47-c65c-4051-afad-8b9667558414\" (UID: \"cce85d47-c65c-4051-afad-8b9667558414\") "
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.077413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cce85d47-c65c-4051-afad-8b9667558414" (UID: "cce85d47-c65c-4051-afad-8b9667558414"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.077511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-client-ca" (OuterVolumeSpecName: "client-ca") pod "cce85d47-c65c-4051-afad-8b9667558414" (UID: "cce85d47-c65c-4051-afad-8b9667558414"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.077570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-config" (OuterVolumeSpecName: "config") pod "cce85d47-c65c-4051-afad-8b9667558414" (UID: "cce85d47-c65c-4051-afad-8b9667558414"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.081911 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce85d47-c65c-4051-afad-8b9667558414-kube-api-access-phs5q" (OuterVolumeSpecName: "kube-api-access-phs5q") pod "cce85d47-c65c-4051-afad-8b9667558414" (UID: "cce85d47-c65c-4051-afad-8b9667558414"). InnerVolumeSpecName "kube-api-access-phs5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.085582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce85d47-c65c-4051-afad-8b9667558414-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cce85d47-c65c-4051-afad-8b9667558414" (UID: "cce85d47-c65c-4051-afad-8b9667558414"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.177948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phs5q\" (UniqueName: \"kubernetes.io/projected/cce85d47-c65c-4051-afad-8b9667558414-kube-api-access-phs5q\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.177980 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.177989 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.177998 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce85d47-c65c-4051-afad-8b9667558414-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.178006 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce85d47-c65c-4051-afad-8b9667558414-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.537139 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.537147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c95786564-fnpmk" event={"ID":"cce85d47-c65c-4051-afad-8b9667558414","Type":"ContainerDied","Data":"ad2438e3f94a9cb6ca352cdc6a2c65331ebf883fa2044d4fa9cf3f971679365c"}
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.538219 4749 scope.go:117] "RemoveContainer" containerID="d898ab105bd1f03bf1d63221ba2ce75ea43c99e73897e9522a451b2fb1864a20"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.541586 4749 generic.go:334] "Generic (PLEG): container finished" podID="7a367aab-4b47-43f2-b158-c0dc8cf9d797" containerID="f31058129b2afcd5320b0a118ae2901602977fb32040db991669deb419f24057" exitCode=0
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.541707 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" event={"ID":"7a367aab-4b47-43f2-b158-c0dc8cf9d797","Type":"ContainerDied","Data":"f31058129b2afcd5320b0a118ae2901602977fb32040db991669deb419f24057"}
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.571106 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c95786564-fnpmk"]
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.573940 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c95786564-fnpmk"]
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.731386 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"]
Feb 19 18:35:29 crc kubenswrapper[4749]: E0219 18:35:29.731617 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="extract-utilities"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.731634 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="extract-utilities"
Feb 19 18:35:29 crc kubenswrapper[4749]: E0219 18:35:29.731653 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce85d47-c65c-4051-afad-8b9667558414" containerName="controller-manager"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.731661 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce85d47-c65c-4051-afad-8b9667558414" containerName="controller-manager"
Feb 19 18:35:29 crc kubenswrapper[4749]: E0219 18:35:29.731675 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="extract-content"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.731682 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="extract-content"
Feb 19 18:35:29 crc kubenswrapper[4749]: E0219 18:35:29.731692 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="registry-server"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.731697 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="registry-server"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.731778 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="856ea752-2729-4936-96aa-423c76975a34" containerName="registry-server"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.731792 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce85d47-c65c-4051-afad-8b9667558414" containerName="controller-manager"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.732263 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.734817 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.734877 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.734903 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.734826 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.735208 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.735378 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.741532 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.749692 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"]
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.785330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e263af71-4df5-47b4-82af-3569f91dcf3e-serving-cert\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.785410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-config\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.785440 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-client-ca\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.785583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-proxy-ca-bundles\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.785617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b78l4\" (UniqueName: \"kubernetes.io/projected/e263af71-4df5-47b4-82af-3569f91dcf3e-kube-api-access-b78l4\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.887058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-proxy-ca-bundles\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.887101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b78l4\" (UniqueName: \"kubernetes.io/projected/e263af71-4df5-47b4-82af-3569f91dcf3e-kube-api-access-b78l4\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.887126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e263af71-4df5-47b4-82af-3569f91dcf3e-serving-cert\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.887160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-config\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.887178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-client-ca\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.887998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-client-ca\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.888787 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-proxy-ca-bundles\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.890482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-config\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.903308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e263af71-4df5-47b4-82af-3569f91dcf3e-serving-cert\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:29 crc kubenswrapper[4749]: I0219 18:35:29.909447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b78l4\" (UniqueName: \"kubernetes.io/projected/e263af71-4df5-47b4-82af-3569f91dcf3e-kube-api-access-b78l4\") pod \"controller-manager-84c477c4f8-v6q6d\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") " pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.012461 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.050443 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.089269 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a367aab-4b47-43f2-b158-c0dc8cf9d797-serving-cert\") pod \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") "
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.089334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-config\") pod \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") "
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.089381 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pdrb\" (UniqueName: \"kubernetes.io/projected/7a367aab-4b47-43f2-b158-c0dc8cf9d797-kube-api-access-9pdrb\") pod \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") "
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.089428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-client-ca\") pod \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\" (UID: \"7a367aab-4b47-43f2-b158-c0dc8cf9d797\") "
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.090122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-client-ca" (OuterVolumeSpecName: "client-ca") pod "7a367aab-4b47-43f2-b158-c0dc8cf9d797" (UID: "7a367aab-4b47-43f2-b158-c0dc8cf9d797"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.090554 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-config" (OuterVolumeSpecName: "config") pod "7a367aab-4b47-43f2-b158-c0dc8cf9d797" (UID: "7a367aab-4b47-43f2-b158-c0dc8cf9d797"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.093173 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a367aab-4b47-43f2-b158-c0dc8cf9d797-kube-api-access-9pdrb" (OuterVolumeSpecName: "kube-api-access-9pdrb") pod "7a367aab-4b47-43f2-b158-c0dc8cf9d797" (UID: "7a367aab-4b47-43f2-b158-c0dc8cf9d797"). InnerVolumeSpecName "kube-api-access-9pdrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.093511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a367aab-4b47-43f2-b158-c0dc8cf9d797-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7a367aab-4b47-43f2-b158-c0dc8cf9d797" (UID: "7a367aab-4b47-43f2-b158-c0dc8cf9d797"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.190970 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pdrb\" (UniqueName: \"kubernetes.io/projected/7a367aab-4b47-43f2-b158-c0dc8cf9d797-kube-api-access-9pdrb\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.191011 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.191039 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a367aab-4b47-43f2-b158-c0dc8cf9d797-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.191051 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a367aab-4b47-43f2-b158-c0dc8cf9d797-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.370875 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zqrzw"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.370937 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zqrzw"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.413559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zqrzw"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.454628 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"]
Feb 19 18:35:30 crc kubenswrapper[4749]: W0219 18:35:30.459973 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode263af71_4df5_47b4_82af_3569f91dcf3e.slice/crio-08f90d28a3c11b3d88d5fc874f9e1afab31f801c4c1e0d6c79741fcd692ae8a0 WatchSource:0}: Error finding container 08f90d28a3c11b3d88d5fc874f9e1afab31f801c4c1e0d6c79741fcd692ae8a0: Status 404 returned error can't find the container with id 08f90d28a3c11b3d88d5fc874f9e1afab31f801c4c1e0d6c79741fcd692ae8a0
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.547586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg" event={"ID":"7a367aab-4b47-43f2-b158-c0dc8cf9d797","Type":"ContainerDied","Data":"4b3f151cee3bbb3b524875d7d65065ea2246aedb2ce83077a285b14ed5ed0eba"}
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.547622 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.547929 4749 scope.go:117] "RemoveContainer" containerID="f31058129b2afcd5320b0a118ae2901602977fb32040db991669deb419f24057"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.548463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d" event={"ID":"e263af71-4df5-47b4-82af-3569f91dcf3e","Type":"ContainerStarted","Data":"08f90d28a3c11b3d88d5fc874f9e1afab31f801c4c1e0d6c79741fcd692ae8a0"}
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.576947 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"]
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.580606 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7548bb6cfb-zbzlg"]
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.586554 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5psfv"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.596479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rmg2b"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.685672 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a367aab-4b47-43f2-b158-c0dc8cf9d797" path="/var/lib/kubelet/pods/7a367aab-4b47-43f2-b158-c0dc8cf9d797/volumes"
Feb 19 18:35:30 crc kubenswrapper[4749]: I0219 18:35:30.686182 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce85d47-c65c-4051-afad-8b9667558414" path="/var/lib/kubelet/pods/cce85d47-c65c-4051-afad-8b9667558414/volumes"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.556445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d" event={"ID":"e263af71-4df5-47b4-82af-3569f91dcf3e","Type":"ContainerStarted","Data":"b90d317ddb02752e5fc56ca67788e601b198208953b13c65dd60ac5761e1c82f"}
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.556677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.564611 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.585264 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d" podStartSLOduration=3.585249326 podStartE2EDuration="3.585249326s" podCreationTimestamp="2026-02-19 18:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:31.582504843 +0000 UTC m=+105.543724817" watchObservedRunningTime="2026-02-19 18:35:31.585249326 +0000 UTC m=+105.546469280"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.732465 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"]
Feb 19 18:35:31 crc kubenswrapper[4749]: E0219 18:35:31.733053 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a367aab-4b47-43f2-b158-c0dc8cf9d797" containerName="route-controller-manager"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.733069 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a367aab-4b47-43f2-b158-c0dc8cf9d797" containerName="route-controller-manager"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.733221 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a367aab-4b47-43f2-b158-c0dc8cf9d797" containerName="route-controller-manager"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.734282 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.744544 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.744921 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.745186 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.745453 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.745784 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.746303 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.753266 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"]
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.811224 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef40c66-77f5-43c6-a6a2-637587bcded2-serving-cert\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"
Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.811316 4749 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-client-ca\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.811342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhk5w\" (UniqueName: \"kubernetes.io/projected/bef40c66-77f5-43c6-a6a2-637587bcded2-kube-api-access-nhk5w\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.811387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-config\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.912789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef40c66-77f5-43c6-a6a2-637587bcded2-serving-cert\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.912845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-client-ca\") pod 
\"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.912872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhk5w\" (UniqueName: \"kubernetes.io/projected/bef40c66-77f5-43c6-a6a2-637587bcded2-kube-api-access-nhk5w\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.912921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-config\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.914121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-client-ca\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.914313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-config\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.922884 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef40c66-77f5-43c6-a6a2-637587bcded2-serving-cert\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:31 crc kubenswrapper[4749]: I0219 18:35:31.929774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhk5w\" (UniqueName: \"kubernetes.io/projected/bef40c66-77f5-43c6-a6a2-637587bcded2-kube-api-access-nhk5w\") pod \"route-controller-manager-7b859fbb45-jwtnp\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") " pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:32 crc kubenswrapper[4749]: I0219 18:35:32.066377 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:32 crc kubenswrapper[4749]: I0219 18:35:32.201690 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:35:32 crc kubenswrapper[4749]: I0219 18:35:32.201753 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:35:32 crc kubenswrapper[4749]: I0219 18:35:32.245308 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:35:32 crc kubenswrapper[4749]: I0219 18:35:32.545555 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"] Feb 19 18:35:32 crc kubenswrapper[4749]: W0219 18:35:32.559101 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbef40c66_77f5_43c6_a6a2_637587bcded2.slice/crio-e451ab7c37af480a0e0a87c30ab4b2d018cdca41887a7e387c425221d652b2f1 WatchSource:0}: Error finding container e451ab7c37af480a0e0a87c30ab4b2d018cdca41887a7e387c425221d652b2f1: Status 404 returned error can't find the container with id e451ab7c37af480a0e0a87c30ab4b2d018cdca41887a7e387c425221d652b2f1 Feb 19 18:35:32 crc kubenswrapper[4749]: I0219 18:35:32.600682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:35:33 crc kubenswrapper[4749]: I0219 18:35:33.571751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" event={"ID":"bef40c66-77f5-43c6-a6a2-637587bcded2","Type":"ContainerStarted","Data":"3d90c0225f14c4f5946a079e426d80a31157c3c9689dc7eb24e639d51cee8082"} Feb 19 18:35:33 crc kubenswrapper[4749]: I0219 18:35:33.572139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" event={"ID":"bef40c66-77f5-43c6-a6a2-637587bcded2","Type":"ContainerStarted","Data":"e451ab7c37af480a0e0a87c30ab4b2d018cdca41887a7e387c425221d652b2f1"} Feb 19 18:35:33 crc kubenswrapper[4749]: I0219 18:35:33.572193 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:33 crc kubenswrapper[4749]: I0219 18:35:33.577517 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" Feb 19 18:35:33 crc kubenswrapper[4749]: I0219 18:35:33.589115 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" podStartSLOduration=5.589097783 
podStartE2EDuration="5.589097783s" podCreationTimestamp="2026-02-19 18:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:33.585599144 +0000 UTC m=+107.546819098" watchObservedRunningTime="2026-02-19 18:35:33.589097783 +0000 UTC m=+107.550317737" Feb 19 18:35:33 crc kubenswrapper[4749]: I0219 18:35:33.908552 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rmg2b"] Feb 19 18:35:33 crc kubenswrapper[4749]: I0219 18:35:33.908769 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rmg2b" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="registry-server" containerID="cri-o://aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179" gracePeriod=2 Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.305663 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.343464 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmzbj\" (UniqueName: \"kubernetes.io/projected/e3563ed4-2c84-4219-9843-c95a8bba26ac-kube-api-access-wmzbj\") pod \"e3563ed4-2c84-4219-9843-c95a8bba26ac\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.346136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-utilities\") pod \"e3563ed4-2c84-4219-9843-c95a8bba26ac\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.346390 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-catalog-content\") pod \"e3563ed4-2c84-4219-9843-c95a8bba26ac\" (UID: \"e3563ed4-2c84-4219-9843-c95a8bba26ac\") " Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.348071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-utilities" (OuterVolumeSpecName: "utilities") pod "e3563ed4-2c84-4219-9843-c95a8bba26ac" (UID: "e3563ed4-2c84-4219-9843-c95a8bba26ac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.348349 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.351362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3563ed4-2c84-4219-9843-c95a8bba26ac-kube-api-access-wmzbj" (OuterVolumeSpecName: "kube-api-access-wmzbj") pod "e3563ed4-2c84-4219-9843-c95a8bba26ac" (UID: "e3563ed4-2c84-4219-9843-c95a8bba26ac"). InnerVolumeSpecName "kube-api-access-wmzbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.409515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3563ed4-2c84-4219-9843-c95a8bba26ac" (UID: "e3563ed4-2c84-4219-9843-c95a8bba26ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.449594 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3563ed4-2c84-4219-9843-c95a8bba26ac-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.449627 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmzbj\" (UniqueName: \"kubernetes.io/projected/e3563ed4-2c84-4219-9843-c95a8bba26ac-kube-api-access-wmzbj\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.585872 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerID="aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179" exitCode=0 Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.586695 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmg2b" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.586678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmg2b" event={"ID":"e3563ed4-2c84-4219-9843-c95a8bba26ac","Type":"ContainerDied","Data":"aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179"} Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.586831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmg2b" event={"ID":"e3563ed4-2c84-4219-9843-c95a8bba26ac","Type":"ContainerDied","Data":"99b07d02a0f5faceb901e01468eaee2fef56e89ded85c8ec2020b22782ca5a07"} Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.586854 4749 scope.go:117] "RemoveContainer" containerID="aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.609869 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-rmg2b"] Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.610837 4749 scope.go:117] "RemoveContainer" containerID="e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.622837 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rmg2b"] Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.640135 4749 scope.go:117] "RemoveContainer" containerID="8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.653118 4749 scope.go:117] "RemoveContainer" containerID="aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179" Feb 19 18:35:34 crc kubenswrapper[4749]: E0219 18:35:34.653504 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179\": container with ID starting with aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179 not found: ID does not exist" containerID="aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.653537 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179"} err="failed to get container status \"aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179\": rpc error: code = NotFound desc = could not find container \"aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179\": container with ID starting with aac3b8d1fd4cd455ba50f08065bf1b842477ab4e6064cc68519ae0907496d179 not found: ID does not exist" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.653558 4749 scope.go:117] "RemoveContainer" 
containerID="e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069" Feb 19 18:35:34 crc kubenswrapper[4749]: E0219 18:35:34.653748 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069\": container with ID starting with e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069 not found: ID does not exist" containerID="e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.653785 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069"} err="failed to get container status \"e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069\": rpc error: code = NotFound desc = could not find container \"e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069\": container with ID starting with e7114aea0dc129d324329047caa7cf90446804babda602b13b65c687f559f069 not found: ID does not exist" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.653797 4749 scope.go:117] "RemoveContainer" containerID="8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80" Feb 19 18:35:34 crc kubenswrapper[4749]: E0219 18:35:34.653969 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80\": container with ID starting with 8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80 not found: ID does not exist" containerID="8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.654000 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80"} err="failed to get container status \"8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80\": rpc error: code = NotFound desc = could not find container \"8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80\": container with ID starting with 8555d99d575efeec2e792717f9565c6d006c8bdd0e007bc0a347ad813d06fd80 not found: ID does not exist" Feb 19 18:35:34 crc kubenswrapper[4749]: I0219 18:35:34.685011 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" path="/var/lib/kubelet/pods/e3563ed4-2c84-4219-9843-c95a8bba26ac/volumes" Feb 19 18:35:36 crc kubenswrapper[4749]: I0219 18:35:36.308172 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tscgs"] Feb 19 18:35:36 crc kubenswrapper[4749]: I0219 18:35:36.308735 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tscgs" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="registry-server" containerID="cri-o://3f2717897c1aa54349ba14cab1558a202e4d64d7460d570a079357a93a17f72d" gracePeriod=2 Feb 19 18:35:37 crc kubenswrapper[4749]: I0219 18:35:37.623612 4749 generic.go:334] "Generic (PLEG): container finished" podID="8955e517-c79b-4f52-9e06-a399c24532cf" containerID="3f2717897c1aa54349ba14cab1558a202e4d64d7460d570a079357a93a17f72d" exitCode=0 Feb 19 18:35:37 crc kubenswrapper[4749]: I0219 18:35:37.623672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tscgs" event={"ID":"8955e517-c79b-4f52-9e06-a399c24532cf","Type":"ContainerDied","Data":"3f2717897c1aa54349ba14cab1558a202e4d64d7460d570a079357a93a17f72d"} Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.391149 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.753925 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.800154 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pdxkf" Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.838990 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-catalog-content\") pod \"8955e517-c79b-4f52-9e06-a399c24532cf\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.839102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn4q6\" (UniqueName: \"kubernetes.io/projected/8955e517-c79b-4f52-9e06-a399c24532cf-kube-api-access-qn4q6\") pod \"8955e517-c79b-4f52-9e06-a399c24532cf\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.839138 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-utilities\") pod \"8955e517-c79b-4f52-9e06-a399c24532cf\" (UID: \"8955e517-c79b-4f52-9e06-a399c24532cf\") " Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.841558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-utilities" (OuterVolumeSpecName: "utilities") pod "8955e517-c79b-4f52-9e06-a399c24532cf" (UID: "8955e517-c79b-4f52-9e06-a399c24532cf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.858391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8955e517-c79b-4f52-9e06-a399c24532cf-kube-api-access-qn4q6" (OuterVolumeSpecName: "kube-api-access-qn4q6") pod "8955e517-c79b-4f52-9e06-a399c24532cf" (UID: "8955e517-c79b-4f52-9e06-a399c24532cf"). InnerVolumeSpecName "kube-api-access-qn4q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.940924 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:38 crc kubenswrapper[4749]: I0219 18:35:38.940966 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn4q6\" (UniqueName: \"kubernetes.io/projected/8955e517-c79b-4f52-9e06-a399c24532cf-kube-api-access-qn4q6\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.121668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8955e517-c79b-4f52-9e06-a399c24532cf" (UID: "8955e517-c79b-4f52-9e06-a399c24532cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.143687 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e517-c79b-4f52-9e06-a399c24532cf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.644470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tscgs" event={"ID":"8955e517-c79b-4f52-9e06-a399c24532cf","Type":"ContainerDied","Data":"41df819f21891fed71c7987d061f4f46f1145f3e2b3e03e9b70c9e65a0d1c322"} Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.644554 4749 scope.go:117] "RemoveContainer" containerID="3f2717897c1aa54349ba14cab1558a202e4d64d7460d570a079357a93a17f72d" Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.644558 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tscgs" Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.667957 4749 scope.go:117] "RemoveContainer" containerID="17e3d31959da0414bef01bac0ee8edbbfa1f9e079591bcf66422ec767644d585" Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.688171 4749 scope.go:117] "RemoveContainer" containerID="00be5b4119000da1bf864d74735e0f2aee7811b98ab0bc3f7076534ffdc1ceff" Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.695177 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tscgs"] Feb 19 18:35:39 crc kubenswrapper[4749]: I0219 18:35:39.698723 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tscgs"] Feb 19 18:35:40 crc kubenswrapper[4749]: I0219 18:35:40.420574 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:35:40 crc kubenswrapper[4749]: I0219 18:35:40.511579 4749 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pdxkf"] Feb 19 18:35:40 crc kubenswrapper[4749]: I0219 18:35:40.511992 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pdxkf" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="registry-server" containerID="cri-o://c74d5aabbdd09d1bd9a8a16830e0bad0afe21534a920f308bc1adbb5aa91ef46" gracePeriod=2 Feb 19 18:35:40 crc kubenswrapper[4749]: I0219 18:35:40.653590 4749 generic.go:334] "Generic (PLEG): container finished" podID="81536517-730c-4da7-b371-efe28f18a1f3" containerID="c74d5aabbdd09d1bd9a8a16830e0bad0afe21534a920f308bc1adbb5aa91ef46" exitCode=0 Feb 19 18:35:40 crc kubenswrapper[4749]: I0219 18:35:40.653748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdxkf" event={"ID":"81536517-730c-4da7-b371-efe28f18a1f3","Type":"ContainerDied","Data":"c74d5aabbdd09d1bd9a8a16830e0bad0afe21534a920f308bc1adbb5aa91ef46"} Feb 19 18:35:40 crc kubenswrapper[4749]: I0219 18:35:40.685549 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" path="/var/lib/kubelet/pods/8955e517-c79b-4f52-9e06-a399c24532cf/volumes" Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.042723 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pdxkf"
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.169130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-catalog-content\") pod \"81536517-730c-4da7-b371-efe28f18a1f3\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") "
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.169219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-utilities\") pod \"81536517-730c-4da7-b371-efe28f18a1f3\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") "
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.169265 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtxh\" (UniqueName: \"kubernetes.io/projected/81536517-730c-4da7-b371-efe28f18a1f3-kube-api-access-4wtxh\") pod \"81536517-730c-4da7-b371-efe28f18a1f3\" (UID: \"81536517-730c-4da7-b371-efe28f18a1f3\") "
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.170290 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-utilities" (OuterVolumeSpecName: "utilities") pod "81536517-730c-4da7-b371-efe28f18a1f3" (UID: "81536517-730c-4da7-b371-efe28f18a1f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.174073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81536517-730c-4da7-b371-efe28f18a1f3-kube-api-access-4wtxh" (OuterVolumeSpecName: "kube-api-access-4wtxh") pod "81536517-730c-4da7-b371-efe28f18a1f3" (UID: "81536517-730c-4da7-b371-efe28f18a1f3"). InnerVolumeSpecName "kube-api-access-4wtxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.250128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81536517-730c-4da7-b371-efe28f18a1f3" (UID: "81536517-730c-4da7-b371-efe28f18a1f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.270913 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtxh\" (UniqueName: \"kubernetes.io/projected/81536517-730c-4da7-b371-efe28f18a1f3-kube-api-access-4wtxh\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.270953 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.270966 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81536517-730c-4da7-b371-efe28f18a1f3-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.663131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdxkf" event={"ID":"81536517-730c-4da7-b371-efe28f18a1f3","Type":"ContainerDied","Data":"f3f2cf87c27b40ab42ea2ac6f61063e28fcf1cf84b86338bb990aaa53d21e39d"}
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.663193 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pdxkf"
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.663213 4749 scope.go:117] "RemoveContainer" containerID="c74d5aabbdd09d1bd9a8a16830e0bad0afe21534a920f308bc1adbb5aa91ef46"
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.678505 4749 scope.go:117] "RemoveContainer" containerID="ff83d5885b32b3e1d0b0a8ec1a6d13cb1cbcdfba0e2e1e19bc101e6b6b5eadb7"
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.693763 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pdxkf"]
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.699288 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pdxkf"]
Feb 19 18:35:41 crc kubenswrapper[4749]: I0219 18:35:41.699819 4749 scope.go:117] "RemoveContainer" containerID="5513c34640b835ca8331f826dd692abc13d4195f0163476d0e0db5e09bb115a0"
Feb 19 18:35:42 crc kubenswrapper[4749]: I0219 18:35:42.685460 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81536517-730c-4da7-b371-efe28f18a1f3" path="/var/lib/kubelet/pods/81536517-730c-4da7-b371-efe28f18a1f3/volumes"
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.357440 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"]
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.357638 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d" podUID="e263af71-4df5-47b4-82af-3569f91dcf3e" containerName="controller-manager" containerID="cri-o://b90d317ddb02752e5fc56ca67788e601b198208953b13c65dd60ac5761e1c82f" gracePeriod=30
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.374314 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"]
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.374846 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" podUID="bef40c66-77f5-43c6-a6a2-637587bcded2" containerName="route-controller-manager" containerID="cri-o://3d90c0225f14c4f5946a079e426d80a31157c3c9689dc7eb24e639d51cee8082" gracePeriod=30
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.698696 4749 generic.go:334] "Generic (PLEG): container finished" podID="e263af71-4df5-47b4-82af-3569f91dcf3e" containerID="b90d317ddb02752e5fc56ca67788e601b198208953b13c65dd60ac5761e1c82f" exitCode=0
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.698778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d" event={"ID":"e263af71-4df5-47b4-82af-3569f91dcf3e","Type":"ContainerDied","Data":"b90d317ddb02752e5fc56ca67788e601b198208953b13c65dd60ac5761e1c82f"}
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.700508 4749 generic.go:334] "Generic (PLEG): container finished" podID="bef40c66-77f5-43c6-a6a2-637587bcded2" containerID="3d90c0225f14c4f5946a079e426d80a31157c3c9689dc7eb24e639d51cee8082" exitCode=0
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.700551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" event={"ID":"bef40c66-77f5-43c6-a6a2-637587bcded2","Type":"ContainerDied","Data":"3d90c0225f14c4f5946a079e426d80a31157c3c9689dc7eb24e639d51cee8082"}
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.887559 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.895787 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.974137 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef40c66-77f5-43c6-a6a2-637587bcded2-serving-cert\") pod \"bef40c66-77f5-43c6-a6a2-637587bcded2\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") "
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.974234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-config\") pod \"bef40c66-77f5-43c6-a6a2-637587bcded2\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") "
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.974304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhk5w\" (UniqueName: \"kubernetes.io/projected/bef40c66-77f5-43c6-a6a2-637587bcded2-kube-api-access-nhk5w\") pod \"bef40c66-77f5-43c6-a6a2-637587bcded2\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") "
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.974337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-client-ca\") pod \"bef40c66-77f5-43c6-a6a2-637587bcded2\" (UID: \"bef40c66-77f5-43c6-a6a2-637587bcded2\") "
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.975083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-client-ca" (OuterVolumeSpecName: "client-ca") pod "bef40c66-77f5-43c6-a6a2-637587bcded2" (UID: "bef40c66-77f5-43c6-a6a2-637587bcded2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.975133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-config" (OuterVolumeSpecName: "config") pod "bef40c66-77f5-43c6-a6a2-637587bcded2" (UID: "bef40c66-77f5-43c6-a6a2-637587bcded2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.979633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef40c66-77f5-43c6-a6a2-637587bcded2-kube-api-access-nhk5w" (OuterVolumeSpecName: "kube-api-access-nhk5w") pod "bef40c66-77f5-43c6-a6a2-637587bcded2" (UID: "bef40c66-77f5-43c6-a6a2-637587bcded2"). InnerVolumeSpecName "kube-api-access-nhk5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:35:48 crc kubenswrapper[4749]: I0219 18:35:48.980351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef40c66-77f5-43c6-a6a2-637587bcded2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bef40c66-77f5-43c6-a6a2-637587bcded2" (UID: "bef40c66-77f5-43c6-a6a2-637587bcded2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.074925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b78l4\" (UniqueName: \"kubernetes.io/projected/e263af71-4df5-47b4-82af-3569f91dcf3e-kube-api-access-b78l4\") pod \"e263af71-4df5-47b4-82af-3569f91dcf3e\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") "
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-config\") pod \"e263af71-4df5-47b4-82af-3569f91dcf3e\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") "
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075120 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-client-ca\") pod \"e263af71-4df5-47b4-82af-3569f91dcf3e\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") "
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-proxy-ca-bundles\") pod \"e263af71-4df5-47b4-82af-3569f91dcf3e\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") "
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e263af71-4df5-47b4-82af-3569f91dcf3e-serving-cert\") pod \"e263af71-4df5-47b4-82af-3569f91dcf3e\" (UID: \"e263af71-4df5-47b4-82af-3569f91dcf3e\") "
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075417 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhk5w\" (UniqueName: \"kubernetes.io/projected/bef40c66-77f5-43c6-a6a2-637587bcded2-kube-api-access-nhk5w\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075444 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075456 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef40c66-77f5-43c6-a6a2-637587bcded2-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075471 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef40c66-77f5-43c6-a6a2-637587bcded2-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075831 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-client-ca" (OuterVolumeSpecName: "client-ca") pod "e263af71-4df5-47b4-82af-3569f91dcf3e" (UID: "e263af71-4df5-47b4-82af-3569f91dcf3e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e263af71-4df5-47b4-82af-3569f91dcf3e" (UID: "e263af71-4df5-47b4-82af-3569f91dcf3e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.075923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-config" (OuterVolumeSpecName: "config") pod "e263af71-4df5-47b4-82af-3569f91dcf3e" (UID: "e263af71-4df5-47b4-82af-3569f91dcf3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.078818 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e263af71-4df5-47b4-82af-3569f91dcf3e-kube-api-access-b78l4" (OuterVolumeSpecName: "kube-api-access-b78l4") pod "e263af71-4df5-47b4-82af-3569f91dcf3e" (UID: "e263af71-4df5-47b4-82af-3569f91dcf3e"). InnerVolumeSpecName "kube-api-access-b78l4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.079363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e263af71-4df5-47b4-82af-3569f91dcf3e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e263af71-4df5-47b4-82af-3569f91dcf3e" (UID: "e263af71-4df5-47b4-82af-3569f91dcf3e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.176495 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e263af71-4df5-47b4-82af-3569f91dcf3e-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.176562 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b78l4\" (UniqueName: \"kubernetes.io/projected/e263af71-4df5-47b4-82af-3569f91dcf3e-kube-api-access-b78l4\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.176587 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.176607 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.176629 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e263af71-4df5-47b4-82af-3569f91dcf3e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.710852 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.710833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84c477c4f8-v6q6d" event={"ID":"e263af71-4df5-47b4-82af-3569f91dcf3e","Type":"ContainerDied","Data":"08f90d28a3c11b3d88d5fc874f9e1afab31f801c4c1e0d6c79741fcd692ae8a0"}
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.710966 4749 scope.go:117] "RemoveContainer" containerID="b90d317ddb02752e5fc56ca67788e601b198208953b13c65dd60ac5761e1c82f"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.713778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp" event={"ID":"bef40c66-77f5-43c6-a6a2-637587bcded2","Type":"ContainerDied","Data":"e451ab7c37af480a0e0a87c30ab4b2d018cdca41887a7e387c425221d652b2f1"}
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.713834 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.734572 4749 scope.go:117] "RemoveContainer" containerID="3d90c0225f14c4f5946a079e426d80a31157c3c9689dc7eb24e639d51cee8082"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.749418 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"]
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.758114 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84c477c4f8-v6q6d"]
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769246 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"]
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769716 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769746 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769762 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="extract-content"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769775 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="extract-content"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769798 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="extract-utilities"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769810 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="extract-utilities"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769825 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="extract-content"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769833 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="extract-content"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769849 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="extract-content"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769857 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="extract-content"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769869 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769877 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769888 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769896 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769908 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="extract-utilities"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769916 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="extract-utilities"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769925 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e263af71-4df5-47b4-82af-3569f91dcf3e" containerName="controller-manager"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769933 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e263af71-4df5-47b4-82af-3569f91dcf3e" containerName="controller-manager"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769941 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef40c66-77f5-43c6-a6a2-637587bcded2" containerName="route-controller-manager"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769949 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef40c66-77f5-43c6-a6a2-637587bcded2" containerName="route-controller-manager"
Feb 19 18:35:49 crc kubenswrapper[4749]: E0219 18:35:49.769962 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="extract-utilities"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.769970 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="extract-utilities"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.770142 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef40c66-77f5-43c6-a6a2-637587bcded2" containerName="route-controller-manager"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.770170 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8955e517-c79b-4f52-9e06-a399c24532cf" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.770186 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e263af71-4df5-47b4-82af-3569f91dcf3e" containerName="controller-manager"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.770201 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="81536517-730c-4da7-b371-efe28f18a1f3" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.770214 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3563ed4-2c84-4219-9843-c95a8bba26ac" containerName="registry-server"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.770895 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.773158 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.774882 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.775188 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.775324 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.775516 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.775693 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.775990 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-698dc6f945-nl4lq"]
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.777516 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.781413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"]
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.783218 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.783377 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.783530 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.783628 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.783732 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.786125 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-698dc6f945-nl4lq"]
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.786273 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.789730 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"]
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.790016 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.794924 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b859fbb45-jwtnp"]
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.872069 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" podUID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" containerName="oauth-openshift" containerID="cri-o://15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10" gracePeriod=15
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87af11fa-6465-4c84-af15-d570aea6592d-serving-cert\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-client-ca\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-config\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcx47\" (UniqueName: \"kubernetes.io/projected/8f53d12d-2978-4387-aff5-365a6839966c-kube-api-access-rcx47\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-config\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-proxy-ca-bundles\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f53d12d-2978-4387-aff5-365a6839966c-serving-cert\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-client-ca\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.883584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkm68\" (UniqueName: \"kubernetes.io/projected/87af11fa-6465-4c84-af15-d570aea6592d-kube-api-access-fkm68\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.984595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-client-ca\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.984819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-config\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.984981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcx47\" (UniqueName: \"kubernetes.io/projected/8f53d12d-2978-4387-aff5-365a6839966c-kube-api-access-rcx47\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.985664 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-config\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.985811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-proxy-ca-bundles\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.987051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-config\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.986278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-config\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.986857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-proxy-ca-bundles\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.985695 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-client-ca\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.987414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f53d12d-2978-4387-aff5-365a6839966c-serving-cert\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.988058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-client-ca\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.988195 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkm68\" (UniqueName: \"kubernetes.io/projected/87af11fa-6465-4c84-af15-d570aea6592d-kube-api-access-fkm68\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"
Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.988346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87af11fa-6465-4c84-af15-d570aea6592d-serving-cert\") pod \"route-controller-manager-57f5b44cd7-lvmb4\"
(UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.988760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-client-ca\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.991655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f53d12d-2978-4387-aff5-365a6839966c-serving-cert\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:35:49 crc kubenswrapper[4749]: I0219 18:35:49.992337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87af11fa-6465-4c84-af15-d570aea6592d-serving-cert\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.003271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcx47\" (UniqueName: \"kubernetes.io/projected/8f53d12d-2978-4387-aff5-365a6839966c-kube-api-access-rcx47\") pod \"controller-manager-698dc6f945-nl4lq\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.006443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkm68\" (UniqueName: 
\"kubernetes.io/projected/87af11fa-6465-4c84-af15-d570aea6592d-kube-api-access-fkm68\") pod \"route-controller-manager-57f5b44cd7-lvmb4\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.102517 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.112728 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.211601 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.397714 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-provider-selection\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.397916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-session\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klmzb\" (UniqueName: \"kubernetes.io/projected/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-kube-api-access-klmzb\") pod 
\"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-ocp-branding-template\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-policies\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-login\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398435 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-serving-cert\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398514 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-router-certs\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: 
\"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-trusted-ca-bundle\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-dir\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-idp-0-file-data\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-error\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.398985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-service-ca\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 
18:35:50.399043 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.399226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-cliconfig\") pod \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\" (UID: \"f8f1b64c-d615-49e7-8b6a-e0f038a58a40\") " Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.399939 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.401319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.401800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.401795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.401940 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.405133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.405515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-kube-api-access-klmzb" (OuterVolumeSpecName: "kube-api-access-klmzb") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "kube-api-access-klmzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.405631 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.406072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.406237 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.407682 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.407865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.408071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.408431 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f8f1b64c-d615-49e7-8b6a-e0f038a58a40" (UID: "f8f1b64c-d615-49e7-8b6a-e0f038a58a40"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501146 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501194 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501214 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501232 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501253 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501273 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501291 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501310 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501331 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501350 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klmzb\" (UniqueName: \"kubernetes.io/projected/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-kube-api-access-klmzb\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501368 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501385 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.501402 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f1b64c-d615-49e7-8b6a-e0f038a58a40-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 
18:35:50.519754 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"] Feb 19 18:35:50 crc kubenswrapper[4749]: W0219 18:35:50.526173 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87af11fa_6465_4c84_af15_d570aea6592d.slice/crio-f88ab0761aeeefe72bf2b93727d7ba326ba23c34c29b800fb4ad45e1014c3481 WatchSource:0}: Error finding container f88ab0761aeeefe72bf2b93727d7ba326ba23c34c29b800fb4ad45e1014c3481: Status 404 returned error can't find the container with id f88ab0761aeeefe72bf2b93727d7ba326ba23c34c29b800fb4ad45e1014c3481 Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.572932 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-698dc6f945-nl4lq"] Feb 19 18:35:50 crc kubenswrapper[4749]: W0219 18:35:50.574631 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f53d12d_2978_4387_aff5_365a6839966c.slice/crio-c4c2e92eaa1bfcd2aebc79bf3e784ff8c0c396ebd7c3b967297c5a57bbf57969 WatchSource:0}: Error finding container c4c2e92eaa1bfcd2aebc79bf3e784ff8c0c396ebd7c3b967297c5a57bbf57969: Status 404 returned error can't find the container with id c4c2e92eaa1bfcd2aebc79bf3e784ff8c0c396ebd7c3b967297c5a57bbf57969 Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.685911 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef40c66-77f5-43c6-a6a2-637587bcded2" path="/var/lib/kubelet/pods/bef40c66-77f5-43c6-a6a2-637587bcded2/volumes" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.688205 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e263af71-4df5-47b4-82af-3569f91dcf3e" path="/var/lib/kubelet/pods/e263af71-4df5-47b4-82af-3569f91dcf3e/volumes" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.722827 4749 
generic.go:334] "Generic (PLEG): container finished" podID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" containerID="15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10" exitCode=0 Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.722904 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.722890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" event={"ID":"f8f1b64c-d615-49e7-8b6a-e0f038a58a40","Type":"ContainerDied","Data":"15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10"} Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.722975 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2dbbd" event={"ID":"f8f1b64c-d615-49e7-8b6a-e0f038a58a40","Type":"ContainerDied","Data":"cae8e74ac12aae92c557e66c0c383bc703699b2c2ae90c92b4ea297987efef20"} Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.722997 4749 scope.go:117] "RemoveContainer" containerID="15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.728940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" event={"ID":"8f53d12d-2978-4387-aff5-365a6839966c","Type":"ContainerStarted","Data":"936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8"} Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.728972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" event={"ID":"8f53d12d-2978-4387-aff5-365a6839966c","Type":"ContainerStarted","Data":"c4c2e92eaa1bfcd2aebc79bf3e784ff8c0c396ebd7c3b967297c5a57bbf57969"} Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.729362 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.731459 4749 patch_prober.go:28] interesting pod/controller-manager-698dc6f945-nl4lq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.731501 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" podUID="8f53d12d-2978-4387-aff5-365a6839966c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.731821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" event={"ID":"87af11fa-6465-4c84-af15-d570aea6592d","Type":"ContainerStarted","Data":"bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513"} Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.731862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" event={"ID":"87af11fa-6465-4c84-af15-d570aea6592d","Type":"ContainerStarted","Data":"f88ab0761aeeefe72bf2b93727d7ba326ba23c34c29b800fb4ad45e1014c3481"} Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.732260 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.737528 4749 patch_prober.go:28] interesting pod/route-controller-manager-57f5b44cd7-lvmb4 container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.737599 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" podUID="87af11fa-6465-4c84-af15-d570aea6592d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.738538 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2dbbd"] Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.743640 4749 scope.go:117] "RemoveContainer" containerID="15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10" Feb 19 18:35:50 crc kubenswrapper[4749]: E0219 18:35:50.744179 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10\": container with ID starting with 15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10 not found: ID does not exist" containerID="15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.744214 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10"} err="failed to get container status \"15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10\": rpc error: code = NotFound desc = could not find container \"15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10\": container with ID starting with 
15f3b08b277ba33865c6359b4e545e70cb1e2d3b638af0cca8d466af5ba87c10 not found: ID does not exist" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.746584 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2dbbd"] Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.760720 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" podStartSLOduration=2.760700528 podStartE2EDuration="2.760700528s" podCreationTimestamp="2026-02-19 18:35:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:50.759312786 +0000 UTC m=+124.720532740" watchObservedRunningTime="2026-02-19 18:35:50.760700528 +0000 UTC m=+124.721920482" Feb 19 18:35:50 crc kubenswrapper[4749]: I0219 18:35:50.783247 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" podStartSLOduration=2.78322629 podStartE2EDuration="2.78322629s" podCreationTimestamp="2026-02-19 18:35:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:35:50.780790825 +0000 UTC m=+124.742010789" watchObservedRunningTime="2026-02-19 18:35:50.78322629 +0000 UTC m=+124.744446244" Feb 19 18:35:51 crc kubenswrapper[4749]: I0219 18:35:51.743611 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:35:51 crc kubenswrapper[4749]: I0219 18:35:51.745004 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.565387 4749 kubelet.go:2431] "SyncLoop 
REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.566208 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591" gracePeriod=15 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.566385 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b" gracePeriod=15 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.566355 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88" gracePeriod=15 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.566397 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe" gracePeriod=15 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.566376 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e" gracePeriod=15 Feb 19 18:35:52 crc 
kubenswrapper[4749]: I0219 18:35:52.638954 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639499 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639534 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639556 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639571 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639601 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639617 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639637 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639650 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639672 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639685 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639706 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639719 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639749 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639764 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.639787 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" containerName="oauth-openshift" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.639805 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" containerName="oauth-openshift" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.640124 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.640163 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.640188 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" containerName="oauth-openshift" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.640215 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.640243 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.640263 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.640285 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.645585 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.646922 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.647851 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.664076 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.128:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.711232 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f1b64c-d615-49e7-8b6a-e0f038a58a40" path="/var/lib/kubelet/pods/f8f1b64c-d615-49e7-8b6a-e0f038a58a40/volumes" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.747605 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.749258 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.749760 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b" exitCode=0 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.749780 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88" exitCode=0 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.749788 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e" exitCode=0 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.749795 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe" exitCode=2 Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.750533 4749 scope.go:117] "RemoveContainer" containerID="49098302a1c2e9d27f3beea4ed4b886169983707335b7599d69c4137e7e65016" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.832971 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.833351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.833498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.833635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.833845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.833892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.833970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.834543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") 
" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.935957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc 
kubenswrapper[4749]: I0219 18:35:52.936163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.936395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.936757 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.936791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.936812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.936835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: I0219 18:35:52.965614 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:52 crc kubenswrapper[4749]: E0219 18:35:52.984932 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.128:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895b9a672eabf17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 18:35:52.983854871 +0000 UTC m=+126.945074825,LastTimestamp:2026-02-19 18:35:52.983854871 +0000 UTC m=+126.945074825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 18:35:53 crc kubenswrapper[4749]: I0219 18:35:53.758346 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 18:35:53 crc kubenswrapper[4749]: I0219 18:35:53.761567 4749 
generic.go:334] "Generic (PLEG): container finished" podID="5e6539f5-4700-4a9d-9428-6752835bbe20" containerID="4656ad09f93735fc310b822ee2c412ec5fc90c47600db1cd180c08d0b5286139" exitCode=0 Feb 19 18:35:53 crc kubenswrapper[4749]: I0219 18:35:53.761651 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5e6539f5-4700-4a9d-9428-6752835bbe20","Type":"ContainerDied","Data":"4656ad09f93735fc310b822ee2c412ec5fc90c47600db1cd180c08d0b5286139"} Feb 19 18:35:53 crc kubenswrapper[4749]: I0219 18:35:53.762400 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:53 crc kubenswrapper[4749]: I0219 18:35:53.763345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf"} Feb 19 18:35:53 crc kubenswrapper[4749]: I0219 18:35:53.763382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1d4df9be0892addc51571beba0691f994c44dff86e1ddbccd6b526d713e2fa11"} Feb 19 18:35:53 crc kubenswrapper[4749]: I0219 18:35:53.764059 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:53 crc kubenswrapper[4749]: E0219 18:35:53.764096 
4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.128:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:54 crc kubenswrapper[4749]: E0219 18:35:54.285959 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.128:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895b9a672eabf17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 18:35:52.983854871 +0000 UTC m=+126.945074825,LastTimestamp:2026-02-19 18:35:52.983854871 +0000 UTC m=+126.945074825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 18:35:54 crc kubenswrapper[4749]: E0219 18:35:54.768277 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.128:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 18:35:54 crc kubenswrapper[4749]: I0219 18:35:54.957178 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 18:35:54 crc kubenswrapper[4749]: I0219 18:35:54.957947 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:35:54 crc kubenswrapper[4749]: I0219 18:35:54.958868 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:54 crc kubenswrapper[4749]: I0219 18:35:54.959588 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.039268 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.039997 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.040606 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.060833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.060999 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.061046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.061039 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.061064 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.061216 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.061315 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.061328 4749 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.134428 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.135209 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.135646 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.136313 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.136641 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.128:6443: connect: connection refused"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.136679 4749 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.136922 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="200ms"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.162438 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-kubelet-dir\") pod \"5e6539f5-4700-4a9d-9428-6752835bbe20\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") "
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.162555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e6539f5-4700-4a9d-9428-6752835bbe20-kube-api-access\") pod \"5e6539f5-4700-4a9d-9428-6752835bbe20\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") "
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.162572 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-var-lock\") pod \"5e6539f5-4700-4a9d-9428-6752835bbe20\" (UID: \"5e6539f5-4700-4a9d-9428-6752835bbe20\") "
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.162621 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5e6539f5-4700-4a9d-9428-6752835bbe20" (UID: "5e6539f5-4700-4a9d-9428-6752835bbe20"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.162691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-var-lock" (OuterVolumeSpecName: "var-lock") pod "5e6539f5-4700-4a9d-9428-6752835bbe20" (UID: "5e6539f5-4700-4a9d-9428-6752835bbe20"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.163348 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.163378 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.163392 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5e6539f5-4700-4a9d-9428-6752835bbe20-var-lock\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.167525 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6539f5-4700-4a9d-9428-6752835bbe20-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5e6539f5-4700-4a9d-9428-6752835bbe20" (UID: "5e6539f5-4700-4a9d-9428-6752835bbe20"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.265141 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e6539f5-4700-4a9d-9428-6752835bbe20-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.337948 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="400ms"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.739661 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="800ms"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.777934 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.777933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5e6539f5-4700-4a9d-9428-6752835bbe20","Type":"ContainerDied","Data":"61d47eb17d54c9f17e2ac716e9908a391157247e33fe3543cc757081b46db643"}
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.779069 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61d47eb17d54c9f17e2ac716e9908a391157247e33fe3543cc757081b46db643"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.781855 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.782582 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591" exitCode=0
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.782630 4749 scope.go:117] "RemoveContainer" containerID="a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.782650 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.798847 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.799488 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.799876 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.800163 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.803218 4749 scope.go:117] "RemoveContainer" containerID="e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.818128 4749 scope.go:117] "RemoveContainer" containerID="3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.843826 4749 scope.go:117] "RemoveContainer" containerID="46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.857865 4749 scope.go:117] "RemoveContainer" containerID="69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.879449 4749 scope.go:117] "RemoveContainer" containerID="de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.900543 4749 scope.go:117] "RemoveContainer" containerID="a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.901412 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b\": container with ID starting with a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b not found: ID does not exist" containerID="a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.901449 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b"} err="failed to get container status \"a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b\": rpc error: code = NotFound desc = could not find container \"a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b\": container with ID starting with a7d9f8e499616cc17dac442dfcb89aa8bf915a4393701c45e1e84e60b734518b not found: ID does not exist"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.901478 4749 scope.go:117] "RemoveContainer" containerID="e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.901950 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\": container with ID starting with e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88 not found: ID does not exist" containerID="e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.901977 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88"} err="failed to get container status \"e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\": rpc error: code = NotFound desc = could not find container \"e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88\": container with ID starting with e9b12443ad3ae04dfb8e1c84530aa6ceb0f813c57b0c18201a84a3e83b219f88 not found: ID does not exist"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.901995 4749 scope.go:117] "RemoveContainer" containerID="3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.902202 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\": container with ID starting with 3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e not found: ID does not exist" containerID="3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.902268 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e"} err="failed to get container status \"3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\": rpc error: code = NotFound desc = could not find container \"3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e\": container with ID starting with 3d2eea03700cf7f239720aedd551d3c3bf84b7a0c8d0b69214a3e868115b1f9e not found: ID does not exist"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.902286 4749 scope.go:117] "RemoveContainer" containerID="46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.902622 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\": container with ID starting with 46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe not found: ID does not exist" containerID="46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.902650 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe"} err="failed to get container status \"46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\": rpc error: code = NotFound desc = could not find container \"46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe\": container with ID starting with 46302fdc6234ad50fae4a7f00eb45d9d7fd624c9680683f332fb020fcca639fe not found: ID does not exist"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.902690 4749 scope.go:117] "RemoveContainer" containerID="69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.903831 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\": container with ID starting with 69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591 not found: ID does not exist" containerID="69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.904006 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591"} err="failed to get container status \"69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\": rpc error: code = NotFound desc = could not find container \"69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591\": container with ID starting with 69af99983784464ff8b25db0e73db93c849f9a6e704ca8f907228103ae2c0591 not found: ID does not exist"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.904137 4749 scope.go:117] "RemoveContainer" containerID="de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302"
Feb 19 18:35:55 crc kubenswrapper[4749]: E0219 18:35:55.904596 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\": container with ID starting with de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302 not found: ID does not exist" containerID="de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302"
Feb 19 18:35:55 crc kubenswrapper[4749]: I0219 18:35:55.904628 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302"} err="failed to get container status \"de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\": rpc error: code = NotFound desc = could not find container \"de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302\": container with ID starting with de75e980cf20d25c7b5d236a8baa09a0c5aa9e5b1150f9d64a12c2e3df0a6302 not found: ID does not exist"
Feb 19 18:35:56 crc kubenswrapper[4749]: E0219 18:35:56.540744 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="1.6s"
Feb 19 18:35:56 crc kubenswrapper[4749]: I0219 18:35:56.680886 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:35:56 crc kubenswrapper[4749]: I0219 18:35:56.681354 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:35:56 crc kubenswrapper[4749]: I0219 18:35:56.685887 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 19 18:35:58 crc kubenswrapper[4749]: E0219 18:35:58.141917 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="3.2s"
Feb 19 18:36:01 crc kubenswrapper[4749]: E0219 18:36:01.343383 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.128:6443: connect: connection refused" interval="6.4s"
Feb 19 18:36:04 crc kubenswrapper[4749]: E0219 18:36:04.287646 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.128:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895b9a672eabf17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 18:35:52.983854871 +0000 UTC m=+126.945074825,LastTimestamp:2026-02-19 18:35:52.983854871 +0000 UTC m=+126.945074825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 19 18:36:04 crc kubenswrapper[4749]: I0219 18:36:04.677922 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:04 crc kubenswrapper[4749]: I0219 18:36:04.678983 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:36:04 crc kubenswrapper[4749]: I0219 18:36:04.692088 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:04 crc kubenswrapper[4749]: I0219 18:36:04.692121 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:04 crc kubenswrapper[4749]: E0219 18:36:04.693004 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:04 crc kubenswrapper[4749]: I0219 18:36:04.693452 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:04 crc kubenswrapper[4749]: W0219 18:36:04.716013 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-1c1376da1a6d5e58add80cc9b864901e4443f56bde59cd56a4a5dcf8ce3d7fd0 WatchSource:0}: Error finding container 1c1376da1a6d5e58add80cc9b864901e4443f56bde59cd56a4a5dcf8ce3d7fd0: Status 404 returned error can't find the container with id 1c1376da1a6d5e58add80cc9b864901e4443f56bde59cd56a4a5dcf8ce3d7fd0
Feb 19 18:36:04 crc kubenswrapper[4749]: I0219 18:36:04.845736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c1376da1a6d5e58add80cc9b864901e4443f56bde59cd56a4a5dcf8ce3d7fd0"}
Feb 19 18:36:05 crc kubenswrapper[4749]: I0219 18:36:05.852967 4749 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a59fecfeae3cc896011bf48301dfa822eb620bbc2cd6294d4cf3865e06840458" exitCode=0
Feb 19 18:36:05 crc kubenswrapper[4749]: I0219 18:36:05.853216 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:05 crc kubenswrapper[4749]: I0219 18:36:05.853244 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:05 crc kubenswrapper[4749]: E0219 18:36:05.853837 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.128:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:05 crc kubenswrapper[4749]: I0219 18:36:05.853116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a59fecfeae3cc896011bf48301dfa822eb620bbc2cd6294d4cf3865e06840458"}
Feb 19 18:36:05 crc kubenswrapper[4749]: I0219 18:36:05.854084 4749 status_manager.go:851] "Failed to get status for pod" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.128:6443: connect: connection refused"
Feb 19 18:36:06 crc kubenswrapper[4749]: I0219 18:36:06.871648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7206b469f19a6efd55d41c155fa3070752e5f30814ee825012608077d306e27e"}
Feb 19 18:36:06 crc kubenswrapper[4749]: I0219 18:36:06.872007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"194fab35f7c5481018d09cb8f1e2ad302446b1fb89f75de72a21136d4a2eaf9f"}
Feb 19 18:36:06 crc kubenswrapper[4749]: I0219 18:36:06.872018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"423f31104cc1181894137c28f4f013ebbb8ecf995a2285606000a48e4b55d273"}
Feb 19 18:36:06 crc kubenswrapper[4749]: I0219 18:36:06.874853 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 18:36:06 crc kubenswrapper[4749]: I0219 18:36:06.874897 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9b3881770674ef4130e3b94e5d7424d9474dc6a57df26af788808d649d3933ed" exitCode=1
Feb 19 18:36:06 crc kubenswrapper[4749]: I0219 18:36:06.874916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9b3881770674ef4130e3b94e5d7424d9474dc6a57df26af788808d649d3933ed"}
Feb 19 18:36:06 crc kubenswrapper[4749]: I0219 18:36:06.875345 4749 scope.go:117] "RemoveContainer" containerID="9b3881770674ef4130e3b94e5d7424d9474dc6a57df26af788808d649d3933ed"
Feb 19 18:36:07 crc kubenswrapper[4749]: I0219 18:36:07.885018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"096bc8526f114b473527ebf0c78e81c18a1c833ab1a87bfa19d9699b6245b654"}
Feb 19 18:36:07 crc kubenswrapper[4749]: I0219 18:36:07.885373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"da4b75901a8ea74631f0ef5093ddf20184526b0cf0f322128f3a9b9ea4b1d20b"}
Feb 19 18:36:07 crc kubenswrapper[4749]: I0219 18:36:07.885405 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:07 crc kubenswrapper[4749]: I0219 18:36:07.885491 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:07 crc kubenswrapper[4749]: I0219 18:36:07.885513 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:07 crc kubenswrapper[4749]: I0219 18:36:07.888757 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 18:36:07 crc kubenswrapper[4749]: I0219 18:36:07.888820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0edc9f6a9720ae3e60a5cc4a40b85c2fc45f6325e7c28529267ef7a7f3b80508"}
Feb 19 18:36:08 crc kubenswrapper[4749]: I0219 18:36:08.058195 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:36:08 crc kubenswrapper[4749]: I0219 18:36:08.058290 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 19 18:36:08 crc kubenswrapper[4749]: I0219 18:36:08.058334 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 19 18:36:09 crc kubenswrapper[4749]: I0219 18:36:09.379795 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 18:36:09 crc kubenswrapper[4749]: I0219 18:36:09.694294 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:09 crc kubenswrapper[4749]: I0219 18:36:09.694347 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:09 crc kubenswrapper[4749]: I0219 18:36:09.700525 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:12 crc kubenswrapper[4749]: I0219 18:36:12.900740 4749 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:12 crc kubenswrapper[4749]: I0219 18:36:12.916749 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:12 crc kubenswrapper[4749]: I0219 18:36:12.916783 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:12 crc kubenswrapper[4749]: I0219 18:36:12.925969 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 18:36:12 crc kubenswrapper[4749]: I0219 18:36:12.928329 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="727f0a0e-a669-464d-9b0d-2ee2c194ca06"
Feb 19 18:36:13 crc kubenswrapper[4749]: I0219 18:36:13.924466 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:13 crc kubenswrapper[4749]: I0219 18:36:13.924525 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876"
Feb 19 18:36:16 crc kubenswrapper[4749]: I0219 18:36:16.696056 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="727f0a0e-a669-464d-9b0d-2ee2c194ca06"
Feb 19 18:36:18 crc kubenswrapper[4749]: I0219 18:36:18.058741 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 19 18:36:18 crc kubenswrapper[4749]: I0219 18:36:18.058818 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 19 18:36:22 crc kubenswrapper[4749]: I0219 18:36:22.845657 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 18:36:23 crc kubenswrapper[4749]: I0219 18:36:23.337965 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 19 18:36:23 crc kubenswrapper[4749]: I0219 18:36:23.338919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 19 18:36:23 crc kubenswrapper[4749]: I0219 18:36:23.513352 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 18:36:23 crc kubenswrapper[4749]: I0219 18:36:23.623810 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 19 18:36:23 crc kubenswrapper[4749]: I0219 18:36:23.718618 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 19 18:36:23 crc kubenswrapper[4749]: I0219 18:36:23.795521 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 19 18:36:24 crc kubenswrapper[4749]: I0219 18:36:24.301715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 19 18:36:24 crc kubenswrapper[4749]: I0219 18:36:24.659518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 19 18:36:24 crc kubenswrapper[4749]: I0219 18:36:24.726231 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 18:36:24 crc kubenswrapper[4749]: I0219 18:36:24.726330 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 18:36:24 crc kubenswrapper[4749]: I0219 18:36:24.810275 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 19 18:36:24 crc kubenswrapper[4749]: I0219 18:36:24.866884 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 19 18:36:24 crc kubenswrapper[4749]: I0219 18:36:24.893983 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 19 18:36:25 crc kubenswrapper[4749]: I0219 18:36:25.150182 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 19 18:36:25 crc kubenswrapper[4749]: I0219 18:36:25.343738 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 19 18:36:25 crc kubenswrapper[4749]: I0219 18:36:25.668119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 19 18:36:25 crc kubenswrapper[4749]: I0219 18:36:25.669821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 18:36:25 crc kubenswrapper[4749]: I0219 18:36:25.687309 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 19 18:36:25 crc kubenswrapper[4749]: I0219 18:36:25.819200 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.068804 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.221235 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.312562 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.336732 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.361592 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.362117 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.375775 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.508262 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.694140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.818261 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.874012 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.896564 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.939852 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.952134 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 19 18:36:26 crc kubenswrapper[4749]: I0219 18:36:26.975059 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.003677 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 19 18:36:27 crc
kubenswrapper[4749]: I0219 18:36:27.050626 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.053507 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.087921 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.146052 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.154919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.207401 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.207652 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.249067 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.286679 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.312636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.336131 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.446575 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.531941 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.553370 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.560431 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.566106 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.733774 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.760393 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.851165 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.853571 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.892858 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.915907 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 18:36:27 crc kubenswrapper[4749]: I0219 18:36:27.947365 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.028468 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.044045 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.048559 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.062462 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.067169 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.075093 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.112094 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.158109 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.184439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.365196 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.405572 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.414535 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.458692 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.466515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.616944 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.851884 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.868334 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.931712 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 18:36:28 crc kubenswrapper[4749]: I0219 18:36:28.933832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.089667 4749 reflector.go:368] Caches populated for *v1.Pod from 
pkg/kubelet/config/apiserver.go:66 Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.093887 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.093936 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn","openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 18:36:29 crc kubenswrapper[4749]: E0219 18:36:29.094158 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" containerName="installer" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.094178 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" containerName="installer" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.094287 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6539f5-4700-4a9d-9428-6752835bbe20" containerName="installer" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.094464 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.094497 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9ce9cc97-651c-4136-a376-5152d4db2876" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.094973 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.096558 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.097755 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.098952 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.098976 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.099322 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.099532 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.099542 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.099600 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.099802 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.100271 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 18:36:29 crc 
kubenswrapper[4749]: I0219 18:36:29.100278 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.100296 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.104732 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.106406 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.110864 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.112691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.143359 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.14333754 podStartE2EDuration="17.14333754s" podCreationTimestamp="2026-02-19 18:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:36:29.133468711 +0000 UTC m=+163.094688735" watchObservedRunningTime="2026-02-19 18:36:29.14333754 +0000 UTC m=+163.104557554" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.209122 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.218906 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.236235 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.251864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-audit-policies\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279785 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279813 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f3d662e-e30d-45ca-bd72-fee1d23a763a-audit-dir\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.279955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-error\") pod 
\"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.280015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.280113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.280148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-session\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.280187 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " 
pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.280211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.280228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg89k\" (UniqueName: \"kubernetes.io/projected/6f3d662e-e30d-45ca-bd72-fee1d23a763a-kube-api-access-qg89k\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.287274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.316325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.345664 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.367092 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-session\") 
pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381664 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg89k\" (UniqueName: \"kubernetes.io/projected/6f3d662e-e30d-45ca-bd72-fee1d23a763a-kube-api-access-qg89k\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381687 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-audit-policies\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381705 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381760 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f3d662e-e30d-45ca-bd72-fee1d23a763a-audit-dir\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381865 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.381900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" Feb 19 18:36:29 crc 
kubenswrapper[4749]: I0219 18:36:29.382493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f3d662e-e30d-45ca-bd72-fee1d23a763a-audit-dir\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.382862 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-audit-policies\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.384395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.385434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.385610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.387058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.387104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.387279 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.388210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.388370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-session\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.388854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.390748 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.393574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f3d662e-e30d-45ca-bd72-fee1d23a763a-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.404332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg89k\" (UniqueName: \"kubernetes.io/projected/6f3d662e-e30d-45ca-bd72-fee1d23a763a-kube-api-access-qg89k\") pod \"oauth-openshift-6cc7c68bbf-mfnpn\" (UID: \"6f3d662e-e30d-45ca-bd72-fee1d23a763a\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.423115 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.543095 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.664512 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.718434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.730165 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.733806 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.762959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.808928 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 19 18:36:29 crc kubenswrapper[4749]: I0219 18:36:29.989881 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.004827 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.033239 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.066140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.091729 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.139377 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.140693 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.227194 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.250715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.371889 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.645270 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.798576 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.837875 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.866654 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.909143 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.917722 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 19 18:36:30 crc kubenswrapper[4749]: I0219 18:36:30.963515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.126399 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.126845 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.151341 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.212518 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.270047 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.276990 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.609467 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.634496 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.649421 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.693140 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.714041 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.747243 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.817197 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.889130 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.951087 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 19 18:36:31 crc kubenswrapper[4749]: I0219 18:36:31.958852 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.001762 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.009989 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.056258 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.108265 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.110881 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.157251 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.164331 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.275809 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.304453 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.344780 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.674821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.695331 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.757120 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.759578 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.798057 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.885958 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.935679 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.950016 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 19 18:36:32 crc kubenswrapper[4749]: I0219 18:36:32.988088 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.005820 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.026406 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.029905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.075978 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.143305 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.182164 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.192140 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.300225 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.376583 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.376650 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.445346 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.445528 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"]
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.450147 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.570837 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.570862 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.584401 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.841807 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.842549 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"]
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.895648 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 19 18:36:33 crc kubenswrapper[4749]: I0219 18:36:33.941834 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.043143 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.050149 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.081783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" event={"ID":"6f3d662e-e30d-45ca-bd72-fee1d23a763a","Type":"ContainerStarted","Data":"d5e10279a2eaf7ec9fe3839214f87feb9a5be9be09596e6f483eadd01ffb8fc1"}
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.277486 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.420753 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.451137 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.606545 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.805905 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.861759 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 19 18:36:34 crc kubenswrapper[4749]: I0219 18:36:34.940427 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.091177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" event={"ID":"6f3d662e-e30d-45ca-bd72-fee1d23a763a","Type":"ContainerStarted","Data":"1d1a26db26d3e04156a801aca4d934b82ea89f90efeeae546c08c82e0416eeb9"}
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.091720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.097100 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.139464 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-mfnpn" podStartSLOduration=71.13944144 podStartE2EDuration="1m11.13944144s" podCreationTimestamp="2026-02-19 18:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:36:35.116621699 +0000 UTC m=+169.077841663" watchObservedRunningTime="2026-02-19 18:36:35.13944144 +0000 UTC m=+169.100661384"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.140693 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.173409 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.208966 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.265306 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.331609 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.407904 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.412184 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.485437 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.485644 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf" gracePeriod=5
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.629969 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.630717 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.664040 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.709213 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.716472 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.766689 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.803604 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.827923 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.900739 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.965902 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 19 18:36:35 crc kubenswrapper[4749]: I0219 18:36:35.977653 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.063368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.115601 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.250243 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.253771 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.372478 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.482726 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.496463 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.535069 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.604347 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.614855 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.622918 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.658756 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 19 18:36:36 crc kubenswrapper[4749]: I0219 18:36:36.972404 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.007158 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.010526 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.011175 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.036100 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.181183 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.181349 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.251302 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.347866 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.417645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.486196 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.515683 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.551485 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.604119 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.638987 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.661804 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.832623 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.900142 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 19 18:36:37 crc kubenswrapper[4749]: I0219 18:36:37.936211 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.001607 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.015792 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.156205 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.174918 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.241534 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.303637 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.372141 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.479218 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.511277 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.686434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.799707 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.828441 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 19 18:36:38 crc kubenswrapper[4749]: I0219 18:36:38.974942 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 19 18:36:39 crc kubenswrapper[4749]: I0219 18:36:39.148413 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 19 18:36:39 crc kubenswrapper[4749]: I0219 18:36:39.285915 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 19 18:36:39 crc kubenswrapper[4749]: I0219 18:36:39.466890 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 19 18:36:39 crc kubenswrapper[4749]: I0219 18:36:39.502570 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 18:36:40 crc kubenswrapper[4749]: I0219 18:36:40.608679 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.099662 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.099737 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.128927 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.128981 4749 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf" exitCode=137
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.129020 4749 scope.go:117] "RemoveContainer" containerID="5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.129137 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.147542 4749 scope.go:117] "RemoveContainer" containerID="5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf"
Feb 19 18:36:41 crc kubenswrapper[4749]: E0219 18:36:41.147995 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf\": container with ID starting with 5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf not found: ID does not exist" containerID="5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.148055 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf"} err="failed to get container status \"5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf\": rpc error: code = NotFound desc = could not find container \"5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf\": container with ID starting with 5c401f93c00f83173784642173aa87b0f3ddda9286d9457cf3eac1673a7e6fbf not found: ID does not exist"
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.227876 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.227915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.227942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.227971 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.227993 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.228051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.228020 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.228084 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.228251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.228321 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.228340 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.228369 4749 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.239308 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.330115 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.330211 4749 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:41 crc kubenswrapper[4749]: I0219 18:36:41.429197 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 18:36:42 crc kubenswrapper[4749]: I0219 18:36:42.686752 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.346363 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-698dc6f945-nl4lq"] Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.347069 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" podUID="8f53d12d-2978-4387-aff5-365a6839966c" containerName="controller-manager" containerID="cri-o://936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8" gracePeriod=30 Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.446290 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"] Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.446504 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" podUID="87af11fa-6465-4c84-af15-d570aea6592d" containerName="route-controller-manager" containerID="cri-o://bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513" gracePeriod=30 Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.786745 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.820442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-client-ca\") pod \"8f53d12d-2978-4387-aff5-365a6839966c\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.820488 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-proxy-ca-bundles\") pod \"8f53d12d-2978-4387-aff5-365a6839966c\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.820512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcx47\" (UniqueName: \"kubernetes.io/projected/8f53d12d-2978-4387-aff5-365a6839966c-kube-api-access-rcx47\") pod \"8f53d12d-2978-4387-aff5-365a6839966c\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.820543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-config\") pod \"8f53d12d-2978-4387-aff5-365a6839966c\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.820558 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f53d12d-2978-4387-aff5-365a6839966c-serving-cert\") pod \"8f53d12d-2978-4387-aff5-365a6839966c\" (UID: \"8f53d12d-2978-4387-aff5-365a6839966c\") " Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.822012 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8f53d12d-2978-4387-aff5-365a6839966c" (UID: "8f53d12d-2978-4387-aff5-365a6839966c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.822468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-config" (OuterVolumeSpecName: "config") pod "8f53d12d-2978-4387-aff5-365a6839966c" (UID: "8f53d12d-2978-4387-aff5-365a6839966c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.822550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f53d12d-2978-4387-aff5-365a6839966c" (UID: "8f53d12d-2978-4387-aff5-365a6839966c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.826700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f53d12d-2978-4387-aff5-365a6839966c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f53d12d-2978-4387-aff5-365a6839966c" (UID: "8f53d12d-2978-4387-aff5-365a6839966c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.829153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f53d12d-2978-4387-aff5-365a6839966c-kube-api-access-rcx47" (OuterVolumeSpecName: "kube-api-access-rcx47") pod "8f53d12d-2978-4387-aff5-365a6839966c" (UID: "8f53d12d-2978-4387-aff5-365a6839966c"). InnerVolumeSpecName "kube-api-access-rcx47". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.860811 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.921445 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.921475 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.921487 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcx47\" (UniqueName: \"kubernetes.io/projected/8f53d12d-2978-4387-aff5-365a6839966c-kube-api-access-rcx47\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.921496 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d12d-2978-4387-aff5-365a6839966c-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:48 crc kubenswrapper[4749]: I0219 18:36:48.921504 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f53d12d-2978-4387-aff5-365a6839966c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.022490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-client-ca\") pod \"87af11fa-6465-4c84-af15-d570aea6592d\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.022559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkm68\" (UniqueName: \"kubernetes.io/projected/87af11fa-6465-4c84-af15-d570aea6592d-kube-api-access-fkm68\") pod \"87af11fa-6465-4c84-af15-d570aea6592d\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.022589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87af11fa-6465-4c84-af15-d570aea6592d-serving-cert\") pod \"87af11fa-6465-4c84-af15-d570aea6592d\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.022655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-config\") pod \"87af11fa-6465-4c84-af15-d570aea6592d\" (UID: \"87af11fa-6465-4c84-af15-d570aea6592d\") " Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.023271 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-client-ca" (OuterVolumeSpecName: "client-ca") pod "87af11fa-6465-4c84-af15-d570aea6592d" (UID: "87af11fa-6465-4c84-af15-d570aea6592d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.023376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-config" (OuterVolumeSpecName: "config") pod "87af11fa-6465-4c84-af15-d570aea6592d" (UID: "87af11fa-6465-4c84-af15-d570aea6592d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.026341 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87af11fa-6465-4c84-af15-d570aea6592d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87af11fa-6465-4c84-af15-d570aea6592d" (UID: "87af11fa-6465-4c84-af15-d570aea6592d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.028001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87af11fa-6465-4c84-af15-d570aea6592d-kube-api-access-fkm68" (OuterVolumeSpecName: "kube-api-access-fkm68") pod "87af11fa-6465-4c84-af15-d570aea6592d" (UID: "87af11fa-6465-4c84-af15-d570aea6592d"). InnerVolumeSpecName "kube-api-access-fkm68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.123569 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.123600 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87af11fa-6465-4c84-af15-d570aea6592d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.123615 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkm68\" (UniqueName: \"kubernetes.io/projected/87af11fa-6465-4c84-af15-d570aea6592d-kube-api-access-fkm68\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.123624 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87af11fa-6465-4c84-af15-d570aea6592d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.169114 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f53d12d-2978-4387-aff5-365a6839966c" containerID="936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8" exitCode=0 Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.169183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" event={"ID":"8f53d12d-2978-4387-aff5-365a6839966c","Type":"ContainerDied","Data":"936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8"} Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.169210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" 
event={"ID":"8f53d12d-2978-4387-aff5-365a6839966c","Type":"ContainerDied","Data":"c4c2e92eaa1bfcd2aebc79bf3e784ff8c0c396ebd7c3b967297c5a57bbf57969"} Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.169228 4749 scope.go:117] "RemoveContainer" containerID="936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.169287 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698dc6f945-nl4lq" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.171186 4749 generic.go:334] "Generic (PLEG): container finished" podID="87af11fa-6465-4c84-af15-d570aea6592d" containerID="bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513" exitCode=0 Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.171274 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.171305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" event={"ID":"87af11fa-6465-4c84-af15-d570aea6592d","Type":"ContainerDied","Data":"bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513"} Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.171443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4" event={"ID":"87af11fa-6465-4c84-af15-d570aea6592d","Type":"ContainerDied","Data":"f88ab0761aeeefe72bf2b93727d7ba326ba23c34c29b800fb4ad45e1014c3481"} Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.192224 4749 scope.go:117] "RemoveContainer" containerID="936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8" Feb 19 18:36:49 crc kubenswrapper[4749]: E0219 18:36:49.192798 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8\": container with ID starting with 936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8 not found: ID does not exist" containerID="936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.192907 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8"} err="failed to get container status \"936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8\": rpc error: code = NotFound desc = could not find container \"936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8\": container with ID starting with 936fe1e4ffbc08e3376e7ed178b8743c5b9dac79bbd7e4ee9926a0d1fb6119a8 not found: ID does not exist" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.192940 4749 scope.go:117] "RemoveContainer" containerID="bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.206299 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"] Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.210585 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f5b44cd7-lvmb4"] Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.221906 4749 scope.go:117] "RemoveContainer" containerID="bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513" Feb 19 18:36:49 crc kubenswrapper[4749]: E0219 18:36:49.222309 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513\": 
container with ID starting with bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513 not found: ID does not exist" containerID="bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.222346 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513"} err="failed to get container status \"bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513\": rpc error: code = NotFound desc = could not find container \"bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513\": container with ID starting with bc9e0a49bbdc83f5b1fccbc4ffdaf3378f80eadf81a398f2ccc19ce1e5ca1513 not found: ID does not exist" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.235442 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-698dc6f945-nl4lq"] Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.241770 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-698dc6f945-nl4lq"] Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.787879 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6"] Feb 19 18:36:49 crc kubenswrapper[4749]: E0219 18:36:49.788195 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.788208 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 18:36:49 crc kubenswrapper[4749]: E0219 18:36:49.788224 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f53d12d-2978-4387-aff5-365a6839966c" containerName="controller-manager" Feb 19 18:36:49 crc 
kubenswrapper[4749]: I0219 18:36:49.788231 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f53d12d-2978-4387-aff5-365a6839966c" containerName="controller-manager" Feb 19 18:36:49 crc kubenswrapper[4749]: E0219 18:36:49.788243 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87af11fa-6465-4c84-af15-d570aea6592d" containerName="route-controller-manager" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.788249 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="87af11fa-6465-4c84-af15-d570aea6592d" containerName="route-controller-manager" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.788355 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f53d12d-2978-4387-aff5-365a6839966c" containerName="controller-manager" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.788374 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="87af11fa-6465-4c84-af15-d570aea6592d" containerName="route-controller-manager" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.788380 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.788776 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.791511 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.793271 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.793406 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.793543 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.793689 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.793907 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.797102 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-xtjkk"] Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.797774 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.799674 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6"] Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.801000 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.801275 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.801880 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.801981 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.802364 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.803554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-xtjkk"] Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.804646 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.806151 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.933801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfhjd\" (UniqueName: 
\"kubernetes.io/projected/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-kube-api-access-cfhjd\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.933855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-config\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.933879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-proxy-ca-bundles\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.933898 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5635d9d3-5d98-4289-83e8-7927b414a0f8-serving-cert\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.933965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-serving-cert\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " 
pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.934114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-client-ca\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.934147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-client-ca\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.934189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-config\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:49 crc kubenswrapper[4749]: I0219 18:36:49.934230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxcf\" (UniqueName: \"kubernetes.io/projected/5635d9d3-5d98-4289-83e8-7927b414a0f8-kube-api-access-rzxcf\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035716 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cfhjd\" (UniqueName: \"kubernetes.io/projected/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-kube-api-access-cfhjd\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-config\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-proxy-ca-bundles\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5635d9d3-5d98-4289-83e8-7927b414a0f8-serving-cert\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035843 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-serving-cert\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 
18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-client-ca\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-client-ca\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-config\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.035952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcf\" (UniqueName: \"kubernetes.io/projected/5635d9d3-5d98-4289-83e8-7927b414a0f8-kube-api-access-rzxcf\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.037110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-client-ca\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: 
\"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.037158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-proxy-ca-bundles\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.037190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-client-ca\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.040120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-config\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.042149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-config\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.045605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-serving-cert\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.053120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5635d9d3-5d98-4289-83e8-7927b414a0f8-serving-cert\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.053391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfhjd\" (UniqueName: \"kubernetes.io/projected/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-kube-api-access-cfhjd\") pod \"controller-manager-867c9d769c-xtjkk\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.054170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxcf\" (UniqueName: \"kubernetes.io/projected/5635d9d3-5d98-4289-83e8-7927b414a0f8-kube-api-access-rzxcf\") pod \"route-controller-manager-6754d5984f-pwcx6\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.122869 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.129895 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.392240 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-xtjkk"] Feb 19 18:36:50 crc kubenswrapper[4749]: W0219 18:36:50.397338 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029c1fb7_d93c_44ef_a03d_db9c7e0e4bd0.slice/crio-b63b51689a96d99d0819004ba517536e6e3c211288adaace6183987ebff4e666 WatchSource:0}: Error finding container b63b51689a96d99d0819004ba517536e6e3c211288adaace6183987ebff4e666: Status 404 returned error can't find the container with id b63b51689a96d99d0819004ba517536e6e3c211288adaace6183987ebff4e666 Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.561180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6"] Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.688903 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87af11fa-6465-4c84-af15-d570aea6592d" path="/var/lib/kubelet/pods/87af11fa-6465-4c84-af15-d570aea6592d/volumes" Feb 19 18:36:50 crc kubenswrapper[4749]: I0219 18:36:50.690309 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f53d12d-2978-4387-aff5-365a6839966c" path="/var/lib/kubelet/pods/8f53d12d-2978-4387-aff5-365a6839966c/volumes" Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.186862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" event={"ID":"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0","Type":"ContainerStarted","Data":"0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249"} Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.187252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" event={"ID":"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0","Type":"ContainerStarted","Data":"b63b51689a96d99d0819004ba517536e6e3c211288adaace6183987ebff4e666"} Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.187275 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.191407 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" event={"ID":"5635d9d3-5d98-4289-83e8-7927b414a0f8","Type":"ContainerStarted","Data":"ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602"} Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.191447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" event={"ID":"5635d9d3-5d98-4289-83e8-7927b414a0f8","Type":"ContainerStarted","Data":"ab3e0a5a6f7d081d976c453cc4941db844b3040f5f08713434b9417a2fba7074"} Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.191884 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.192273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.206038 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" podStartSLOduration=3.206000686 podStartE2EDuration="3.206000686s" podCreationTimestamp="2026-02-19 18:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 18:36:51.203511526 +0000 UTC m=+185.164731560" watchObservedRunningTime="2026-02-19 18:36:51.206000686 +0000 UTC m=+185.167220640" Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.220409 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:36:51 crc kubenswrapper[4749]: I0219 18:36:51.267730 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" podStartSLOduration=3.267712899 podStartE2EDuration="3.267712899s" podCreationTimestamp="2026-02-19 18:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:36:51.248934755 +0000 UTC m=+185.210154709" watchObservedRunningTime="2026-02-19 18:36:51.267712899 +0000 UTC m=+185.228932853" Feb 19 18:36:54 crc kubenswrapper[4749]: I0219 18:36:54.725401 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:36:54 crc kubenswrapper[4749]: I0219 18:36:54.725764 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:36:58 crc kubenswrapper[4749]: I0219 18:36:58.066718 4749 generic.go:334] "Generic (PLEG): container finished" podID="be345565-0341-4290-b5e8-9cf728685a6b" containerID="06d5e96737d0f4a47d1dddbd662440c6a106b31f04a61223d71c203bd702097a" exitCode=0 Feb 19 
18:36:58 crc kubenswrapper[4749]: I0219 18:36:58.066784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" event={"ID":"be345565-0341-4290-b5e8-9cf728685a6b","Type":"ContainerDied","Data":"06d5e96737d0f4a47d1dddbd662440c6a106b31f04a61223d71c203bd702097a"} Feb 19 18:36:58 crc kubenswrapper[4749]: I0219 18:36:58.067473 4749 scope.go:117] "RemoveContainer" containerID="06d5e96737d0f4a47d1dddbd662440c6a106b31f04a61223d71c203bd702097a" Feb 19 18:36:59 crc kubenswrapper[4749]: I0219 18:36:59.077629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" event={"ID":"be345565-0341-4290-b5e8-9cf728685a6b","Type":"ContainerStarted","Data":"3008ef3793866254d0b21a4dc72db01224bae12a5c710639cea59c82e9571367"} Feb 19 18:36:59 crc kubenswrapper[4749]: I0219 18:36:59.078725 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:36:59 crc kubenswrapper[4749]: I0219 18:36:59.081380 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:37:08 crc kubenswrapper[4749]: I0219 18:37:08.367642 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-xtjkk"] Feb 19 18:37:08 crc kubenswrapper[4749]: I0219 18:37:08.368401 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" podUID="029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" containerName="controller-manager" containerID="cri-o://0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249" gracePeriod=30 Feb 19 18:37:08 crc kubenswrapper[4749]: I0219 18:37:08.391999 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6"] Feb 19 18:37:08 crc kubenswrapper[4749]: I0219 18:37:08.392263 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" podUID="5635d9d3-5d98-4289-83e8-7927b414a0f8" containerName="route-controller-manager" containerID="cri-o://ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602" gracePeriod=30 Feb 19 18:37:08 crc kubenswrapper[4749]: I0219 18:37:08.922298 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:37:08 crc kubenswrapper[4749]: I0219 18:37:08.965694 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-serving-cert\") pod \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzxcf\" (UniqueName: \"kubernetes.io/projected/5635d9d3-5d98-4289-83e8-7927b414a0f8-kube-api-access-rzxcf\") pod \"5635d9d3-5d98-4289-83e8-7927b414a0f8\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-config\") pod \"5635d9d3-5d98-4289-83e8-7927b414a0f8\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " Feb 19 
18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-client-ca\") pod \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107385 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5635d9d3-5d98-4289-83e8-7927b414a0f8-serving-cert\") pod \"5635d9d3-5d98-4289-83e8-7927b414a0f8\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-client-ca\") pod \"5635d9d3-5d98-4289-83e8-7927b414a0f8\" (UID: \"5635d9d3-5d98-4289-83e8-7927b414a0f8\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107514 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfhjd\" (UniqueName: \"kubernetes.io/projected/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-kube-api-access-cfhjd\") pod \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107554 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-proxy-ca-bundles\") pod \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.107573 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-config\") 
pod \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\" (UID: \"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0\") " Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.108305 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-config" (OuterVolumeSpecName: "config") pod "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" (UID: "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.108300 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-config" (OuterVolumeSpecName: "config") pod "5635d9d3-5d98-4289-83e8-7927b414a0f8" (UID: "5635d9d3-5d98-4289-83e8-7927b414a0f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.108659 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" (UID: "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.108716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "5635d9d3-5d98-4289-83e8-7927b414a0f8" (UID: "5635d9d3-5d98-4289-83e8-7927b414a0f8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.109139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" (UID: "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.113531 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5635d9d3-5d98-4289-83e8-7927b414a0f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5635d9d3-5d98-4289-83e8-7927b414a0f8" (UID: "5635d9d3-5d98-4289-83e8-7927b414a0f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.113598 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5635d9d3-5d98-4289-83e8-7927b414a0f8-kube-api-access-rzxcf" (OuterVolumeSpecName: "kube-api-access-rzxcf") pod "5635d9d3-5d98-4289-83e8-7927b414a0f8" (UID: "5635d9d3-5d98-4289-83e8-7927b414a0f8"). InnerVolumeSpecName "kube-api-access-rzxcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.113673 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" (UID: "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.113694 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-kube-api-access-cfhjd" (OuterVolumeSpecName: "kube-api-access-cfhjd") pod "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" (UID: "029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0"). InnerVolumeSpecName "kube-api-access-cfhjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.134923 4749 generic.go:334] "Generic (PLEG): container finished" podID="5635d9d3-5d98-4289-83e8-7927b414a0f8" containerID="ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602" exitCode=0 Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.134986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" event={"ID":"5635d9d3-5d98-4289-83e8-7927b414a0f8","Type":"ContainerDied","Data":"ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602"} Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.135012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" event={"ID":"5635d9d3-5d98-4289-83e8-7927b414a0f8","Type":"ContainerDied","Data":"ab3e0a5a6f7d081d976c453cc4941db844b3040f5f08713434b9417a2fba7074"} Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.135044 4749 scope.go:117] "RemoveContainer" containerID="ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.135146 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6" Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.137830 4749 generic.go:334] "Generic (PLEG): container finished" podID="029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" containerID="0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249" exitCode=0 Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.137922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" event={"ID":"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0","Type":"ContainerDied","Data":"0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249"} Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.137997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk" event={"ID":"029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0","Type":"ContainerDied","Data":"b63b51689a96d99d0819004ba517536e6e3c211288adaace6183987ebff4e666"} Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.138134 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-867c9d769c-xtjkk"
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.154940 4749 scope.go:117] "RemoveContainer" containerID="ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602"
Feb 19 18:37:09 crc kubenswrapper[4749]: E0219 18:37:09.155397 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602\": container with ID starting with ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602 not found: ID does not exist" containerID="ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602"
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.155450 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602"} err="failed to get container status \"ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602\": rpc error: code = NotFound desc = could not find container \"ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602\": container with ID starting with ffa60dad29852e713a8c310c6ec4a41bd95999e684aa9ecb64f9e595fcd24602 not found: ID does not exist"
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.155476 4749 scope.go:117] "RemoveContainer" containerID="0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249"
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.167981 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6"]
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.175249 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-pwcx6"]
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.175441 4749 scope.go:117] "RemoveContainer" containerID="0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249"
Feb 19 18:37:09 crc kubenswrapper[4749]: E0219 18:37:09.175854 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249\": container with ID starting with 0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249 not found: ID does not exist" containerID="0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249"
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.175882 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249"} err="failed to get container status \"0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249\": rpc error: code = NotFound desc = could not find container \"0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249\": container with ID starting with 0efc5c91ae793d46f0114d06e992c9bfc3c8c49a1cfde9f908f6d1ccfd26d249 not found: ID does not exist"
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.178862 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-xtjkk"]
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.182074 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-xtjkk"]
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.208760 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5635d9d3-5d98-4289-83e8-7927b414a0f8-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.208996 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.209112 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfhjd\" (UniqueName: \"kubernetes.io/projected/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-kube-api-access-cfhjd\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.209180 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.209242 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.209303 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.209364 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzxcf\" (UniqueName: \"kubernetes.io/projected/5635d9d3-5d98-4289-83e8-7927b414a0f8-kube-api-access-rzxcf\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.209419 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5635d9d3-5d98-4289-83e8-7927b414a0f8-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:09 crc kubenswrapper[4749]: I0219 18:37:09.209478 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.056416 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-585fb5685-xkfb6"]
Feb 19 18:37:10 crc kubenswrapper[4749]: E0219 18:37:10.057206 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5635d9d3-5d98-4289-83e8-7927b414a0f8" containerName="route-controller-manager"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.057229 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5635d9d3-5d98-4289-83e8-7927b414a0f8" containerName="route-controller-manager"
Feb 19 18:37:10 crc kubenswrapper[4749]: E0219 18:37:10.057248 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" containerName="controller-manager"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.057262 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" containerName="controller-manager"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.057468 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5635d9d3-5d98-4289-83e8-7927b414a0f8" containerName="route-controller-manager"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.057502 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" containerName="controller-manager"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.058137 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.064216 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.064315 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.064341 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.064536 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.065892 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"]
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.067194 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.070995 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.071489 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.071668 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.071952 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.072436 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.072556 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.072718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.073236 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.075871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585fb5685-xkfb6"]
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.080703 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.098180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"]
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-client-ca\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-config\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-config\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffcedbd0-81fd-49e3-8801-127d67a7b71a-serving-cert\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9vbs\" (UniqueName: \"kubernetes.io/projected/43f9c67b-3e36-4eb5-8960-7b39ada11a50-kube-api-access-l9vbs\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-proxy-ca-bundles\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-client-ca\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f9c67b-3e36-4eb5-8960-7b39ada11a50-serving-cert\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.122591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz8mm\" (UniqueName: \"kubernetes.io/projected/ffcedbd0-81fd-49e3-8801-127d67a7b71a-kube-api-access-xz8mm\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f9c67b-3e36-4eb5-8960-7b39ada11a50-serving-cert\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz8mm\" (UniqueName: \"kubernetes.io/projected/ffcedbd0-81fd-49e3-8801-127d67a7b71a-kube-api-access-xz8mm\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-client-ca\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-config\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-config\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffcedbd0-81fd-49e3-8801-127d67a7b71a-serving-cert\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9vbs\" (UniqueName: \"kubernetes.io/projected/43f9c67b-3e36-4eb5-8960-7b39ada11a50-kube-api-access-l9vbs\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-proxy-ca-bundles\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.224346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-client-ca\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.225369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-client-ca\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.225811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-config\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.226078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-client-ca\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.226138 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-proxy-ca-bundles\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.226180 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-config\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.228500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffcedbd0-81fd-49e3-8801-127d67a7b71a-serving-cert\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.240312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f9c67b-3e36-4eb5-8960-7b39ada11a50-serving-cert\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.246775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9vbs\" (UniqueName: \"kubernetes.io/projected/43f9c67b-3e36-4eb5-8960-7b39ada11a50-kube-api-access-l9vbs\") pod \"route-controller-manager-df4584fbc-l2tzz\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") " pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.259972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz8mm\" (UniqueName: \"kubernetes.io/projected/ffcedbd0-81fd-49e3-8801-127d67a7b71a-kube-api-access-xz8mm\") pod \"controller-manager-585fb5685-xkfb6\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") " pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.400484 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.409329 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.673716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"]
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.689717 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0" path="/var/lib/kubelet/pods/029c1fb7-d93c-44ef-a03d-db9c7e0e4bd0/volumes"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.691062 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5635d9d3-5d98-4289-83e8-7927b414a0f8" path="/var/lib/kubelet/pods/5635d9d3-5d98-4289-83e8-7927b414a0f8/volumes"
Feb 19 18:37:10 crc kubenswrapper[4749]: I0219 18:37:10.840625 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585fb5685-xkfb6"]
Feb 19 18:37:10 crc kubenswrapper[4749]: W0219 18:37:10.854180 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcedbd0_81fd_49e3_8801_127d67a7b71a.slice/crio-b5c5275a37189060a6e3c88c680c1ac8acf4eb587e6e19e6aa74a61f757cf9e4 WatchSource:0}: Error finding container b5c5275a37189060a6e3c88c680c1ac8acf4eb587e6e19e6aa74a61f757cf9e4: Status 404 returned error can't find the container with id b5c5275a37189060a6e3c88c680c1ac8acf4eb587e6e19e6aa74a61f757cf9e4
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.167258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz" event={"ID":"43f9c67b-3e36-4eb5-8960-7b39ada11a50","Type":"ContainerStarted","Data":"edf2f4b30dcc0e630c50e2df07351db5a8886af528e6bc56963a7c4f0f3d115d"}
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.167305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz" event={"ID":"43f9c67b-3e36-4eb5-8960-7b39ada11a50","Type":"ContainerStarted","Data":"ead0296ad9f187df63a2dc69df279973bd184dd015c4206f84050f9cf5321685"}
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.167580 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.169578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6" event={"ID":"ffcedbd0-81fd-49e3-8801-127d67a7b71a","Type":"ContainerStarted","Data":"2ea728ae8a3149db958eb2e056c5471c4ef784737f38d05c5146e8637fbfdc35"}
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.169637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6" event={"ID":"ffcedbd0-81fd-49e3-8801-127d67a7b71a","Type":"ContainerStarted","Data":"b5c5275a37189060a6e3c88c680c1ac8acf4eb587e6e19e6aa74a61f757cf9e4"}
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.170099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.185017 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.187850 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz" podStartSLOduration=3.187838774 podStartE2EDuration="3.187838774s" podCreationTimestamp="2026-02-19 18:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:37:11.187409043 +0000 UTC m=+205.148628997" watchObservedRunningTime="2026-02-19 18:37:11.187838774 +0000 UTC m=+205.149058728"
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.213306 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6" podStartSLOduration=3.213287657 podStartE2EDuration="3.213287657s" podCreationTimestamp="2026-02-19 18:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:37:11.211227445 +0000 UTC m=+205.172447399" watchObservedRunningTime="2026-02-19 18:37:11.213287657 +0000 UTC m=+205.174507611"
Feb 19 18:37:11 crc kubenswrapper[4749]: I0219 18:37:11.478432 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:24 crc kubenswrapper[4749]: I0219 18:37:24.725319 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 18:37:24 crc kubenswrapper[4749]: I0219 18:37:24.726173 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 18:37:24 crc kubenswrapper[4749]: I0219 18:37:24.726253 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 18:37:24 crc kubenswrapper[4749]: I0219 18:37:24.727099 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 18:37:24 crc kubenswrapper[4749]: I0219 18:37:24.727195 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd" gracePeriod=600
Feb 19 18:37:25 crc kubenswrapper[4749]: I0219 18:37:25.238583 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd" exitCode=0
Feb 19 18:37:25 crc kubenswrapper[4749]: I0219 18:37:25.238619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd"}
Feb 19 18:37:26 crc kubenswrapper[4749]: I0219 18:37:26.247010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"e950b1a12f156ed5917a35268977b9c8856e348f780776f4a9cea21c27147df8"}
Feb 19 18:37:28 crc kubenswrapper[4749]: I0219 18:37:28.798004 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585fb5685-xkfb6"]
Feb 19 18:37:28 crc kubenswrapper[4749]: I0219 18:37:28.798678 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6" podUID="ffcedbd0-81fd-49e3-8801-127d67a7b71a" containerName="controller-manager" containerID="cri-o://2ea728ae8a3149db958eb2e056c5471c4ef784737f38d05c5146e8637fbfdc35" gracePeriod=30
Feb 19 18:37:28 crc kubenswrapper[4749]: I0219 18:37:28.874718 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"]
Feb 19 18:37:28 crc kubenswrapper[4749]: I0219 18:37:28.875048 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz" podUID="43f9c67b-3e36-4eb5-8960-7b39ada11a50" containerName="route-controller-manager" containerID="cri-o://edf2f4b30dcc0e630c50e2df07351db5a8886af528e6bc56963a7c4f0f3d115d" gracePeriod=30
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.265491 4749 generic.go:334] "Generic (PLEG): container finished" podID="43f9c67b-3e36-4eb5-8960-7b39ada11a50" containerID="edf2f4b30dcc0e630c50e2df07351db5a8886af528e6bc56963a7c4f0f3d115d" exitCode=0
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.265615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz" event={"ID":"43f9c67b-3e36-4eb5-8960-7b39ada11a50","Type":"ContainerDied","Data":"edf2f4b30dcc0e630c50e2df07351db5a8886af528e6bc56963a7c4f0f3d115d"}
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.268052 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffcedbd0-81fd-49e3-8801-127d67a7b71a" containerID="2ea728ae8a3149db958eb2e056c5471c4ef784737f38d05c5146e8637fbfdc35" exitCode=0
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.268090 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6" event={"ID":"ffcedbd0-81fd-49e3-8801-127d67a7b71a","Type":"ContainerDied","Data":"2ea728ae8a3149db958eb2e056c5471c4ef784737f38d05c5146e8637fbfdc35"}
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.362014 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.417017 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6"
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.496405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-config\") pod \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.496473 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-client-ca\") pod \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.496507 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f9c67b-3e36-4eb5-8960-7b39ada11a50-serving-cert\") pod \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.496577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9vbs\" (UniqueName: \"kubernetes.io/projected/43f9c67b-3e36-4eb5-8960-7b39ada11a50-kube-api-access-l9vbs\") pod \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\" (UID: \"43f9c67b-3e36-4eb5-8960-7b39ada11a50\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.497704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-client-ca" (OuterVolumeSpecName: "client-ca") pod "43f9c67b-3e36-4eb5-8960-7b39ada11a50" (UID: "43f9c67b-3e36-4eb5-8960-7b39ada11a50"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.497779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-config" (OuterVolumeSpecName: "config") pod "43f9c67b-3e36-4eb5-8960-7b39ada11a50" (UID: "43f9c67b-3e36-4eb5-8960-7b39ada11a50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.502802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f9c67b-3e36-4eb5-8960-7b39ada11a50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "43f9c67b-3e36-4eb5-8960-7b39ada11a50" (UID: "43f9c67b-3e36-4eb5-8960-7b39ada11a50"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.503306 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f9c67b-3e36-4eb5-8960-7b39ada11a50-kube-api-access-l9vbs" (OuterVolumeSpecName: "kube-api-access-l9vbs") pod "43f9c67b-3e36-4eb5-8960-7b39ada11a50" (UID: "43f9c67b-3e36-4eb5-8960-7b39ada11a50"). InnerVolumeSpecName "kube-api-access-l9vbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-config\") pod \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597251 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz8mm\" (UniqueName: \"kubernetes.io/projected/ffcedbd0-81fd-49e3-8801-127d67a7b71a-kube-api-access-xz8mm\") pod \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597298 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-proxy-ca-bundles\") pod \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597349 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-client-ca\") pod \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffcedbd0-81fd-49e3-8801-127d67a7b71a-serving-cert\") pod \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\" (UID: \"ffcedbd0-81fd-49e3-8801-127d67a7b71a\") "
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597682 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597700 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f9c67b-3e36-4eb5-8960-7b39ada11a50-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597709 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f9c67b-3e36-4eb5-8960-7b39ada11a50-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.597718 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9vbs\" (UniqueName: \"kubernetes.io/projected/43f9c67b-3e36-4eb5-8960-7b39ada11a50-kube-api-access-l9vbs\") on node \"crc\" DevicePath \"\""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.598700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-config" (OuterVolumeSpecName: "config") pod "ffcedbd0-81fd-49e3-8801-127d67a7b71a" (UID: "ffcedbd0-81fd-49e3-8801-127d67a7b71a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.598862 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-client-ca" (OuterVolumeSpecName: "client-ca") pod "ffcedbd0-81fd-49e3-8801-127d67a7b71a" (UID: "ffcedbd0-81fd-49e3-8801-127d67a7b71a"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.598873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ffcedbd0-81fd-49e3-8801-127d67a7b71a" (UID: "ffcedbd0-81fd-49e3-8801-127d67a7b71a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.600912 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffcedbd0-81fd-49e3-8801-127d67a7b71a-kube-api-access-xz8mm" (OuterVolumeSpecName: "kube-api-access-xz8mm") pod "ffcedbd0-81fd-49e3-8801-127d67a7b71a" (UID: "ffcedbd0-81fd-49e3-8801-127d67a7b71a"). InnerVolumeSpecName "kube-api-access-xz8mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.601164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffcedbd0-81fd-49e3-8801-127d67a7b71a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ffcedbd0-81fd-49e3-8801-127d67a7b71a" (UID: "ffcedbd0-81fd-49e3-8801-127d67a7b71a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.698998 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffcedbd0-81fd-49e3-8801-127d67a7b71a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.699040 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.699050 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz8mm\" (UniqueName: \"kubernetes.io/projected/ffcedbd0-81fd-49e3-8801-127d67a7b71a-kube-api-access-xz8mm\") on node \"crc\" DevicePath \"\"" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.699061 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 18:37:29 crc kubenswrapper[4749]: I0219 18:37:29.699069 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcedbd0-81fd-49e3-8801-127d67a7b71a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.073356 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-ldlxw"] Feb 19 18:37:30 crc kubenswrapper[4749]: E0219 18:37:30.073607 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcedbd0-81fd-49e3-8801-127d67a7b71a" containerName="controller-manager" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.073618 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcedbd0-81fd-49e3-8801-127d67a7b71a" containerName="controller-manager" Feb 19 18:37:30 crc 
kubenswrapper[4749]: E0219 18:37:30.073630 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f9c67b-3e36-4eb5-8960-7b39ada11a50" containerName="route-controller-manager" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.073636 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f9c67b-3e36-4eb5-8960-7b39ada11a50" containerName="route-controller-manager" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.073740 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffcedbd0-81fd-49e3-8801-127d67a7b71a" containerName="controller-manager" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.073749 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f9c67b-3e36-4eb5-8960-7b39ada11a50" containerName="route-controller-manager" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.074158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.080630 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.081332 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.086762 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.091539 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-ldlxw"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.103368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772b35c8-0ee5-4853-846f-1332e07b872e-config\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.103415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772b35c8-0ee5-4853-846f-1332e07b872e-client-ca\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772b35c8-0ee5-4853-846f-1332e07b872e-serving-cert\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205326 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/772b35c8-0ee5-4853-846f-1332e07b872e-config\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772b35c8-0ee5-4853-846f-1332e07b872e-client-ca\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-config\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-proxy-ca-bundles\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frjz8\" (UniqueName: \"kubernetes.io/projected/15e4dbc2-4810-4dfa-8f96-67a03c653232-kube-api-access-frjz8\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " 
pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205468 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15e4dbc2-4810-4dfa-8f96-67a03c653232-serving-cert\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205525 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-client-ca\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.205542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx66s\" (UniqueName: \"kubernetes.io/projected/772b35c8-0ee5-4853-846f-1332e07b872e-kube-api-access-sx66s\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.206295 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772b35c8-0ee5-4853-846f-1332e07b872e-client-ca\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.206433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/772b35c8-0ee5-4853-846f-1332e07b872e-config\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.275168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6" event={"ID":"ffcedbd0-81fd-49e3-8801-127d67a7b71a","Type":"ContainerDied","Data":"b5c5275a37189060a6e3c88c680c1ac8acf4eb587e6e19e6aa74a61f757cf9e4"} Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.275220 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585fb5685-xkfb6" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.275259 4749 scope.go:117] "RemoveContainer" containerID="2ea728ae8a3149db958eb2e056c5471c4ef784737f38d05c5146e8637fbfdc35" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.279517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz" event={"ID":"43f9c67b-3e36-4eb5-8960-7b39ada11a50","Type":"ContainerDied","Data":"ead0296ad9f187df63a2dc69df279973bd184dd015c4206f84050f9cf5321685"} Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.279570 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.297635 4749 scope.go:117] "RemoveContainer" containerID="edf2f4b30dcc0e630c50e2df07351db5a8886af528e6bc56963a7c4f0f3d115d" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.306181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-client-ca\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.306224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx66s\" (UniqueName: \"kubernetes.io/projected/772b35c8-0ee5-4853-846f-1332e07b872e-kube-api-access-sx66s\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.306255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772b35c8-0ee5-4853-846f-1332e07b872e-serving-cert\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.306302 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-config\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 
18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.306325 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-proxy-ca-bundles\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.306344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frjz8\" (UniqueName: \"kubernetes.io/projected/15e4dbc2-4810-4dfa-8f96-67a03c653232-kube-api-access-frjz8\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.306378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15e4dbc2-4810-4dfa-8f96-67a03c653232-serving-cert\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.307261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-client-ca\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.308006 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585fb5685-xkfb6"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.308304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-config\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.308924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15e4dbc2-4810-4dfa-8f96-67a03c653232-proxy-ca-bundles\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.311981 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15e4dbc2-4810-4dfa-8f96-67a03c653232-serving-cert\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.313162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772b35c8-0ee5-4853-846f-1332e07b872e-serving-cert\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.316312 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-585fb5685-xkfb6"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.319342 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.322040 4749 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df4584fbc-l2tzz"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.323095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx66s\" (UniqueName: \"kubernetes.io/projected/772b35c8-0ee5-4853-846f-1332e07b872e-kube-api-access-sx66s\") pod \"route-controller-manager-6754d5984f-zksdl\" (UID: \"772b35c8-0ee5-4853-846f-1332e07b872e\") " pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.323704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frjz8\" (UniqueName: \"kubernetes.io/projected/15e4dbc2-4810-4dfa-8f96-67a03c653232-kube-api-access-frjz8\") pod \"controller-manager-867c9d769c-ldlxw\" (UID: \"15e4dbc2-4810-4dfa-8f96-67a03c653232\") " pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.399845 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.409486 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.689379 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f9c67b-3e36-4eb5-8960-7b39ada11a50" path="/var/lib/kubelet/pods/43f9c67b-3e36-4eb5-8960-7b39ada11a50/volumes" Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.690354 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffcedbd0-81fd-49e3-8801-127d67a7b71a" path="/var/lib/kubelet/pods/ffcedbd0-81fd-49e3-8801-127d67a7b71a/volumes" Feb 19 18:37:30 crc kubenswrapper[4749]: W0219 18:37:30.788713 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15e4dbc2_4810_4dfa_8f96_67a03c653232.slice/crio-623573f25a99d329942635a7fc7ad6fa9e41f9da59ef43c3f8171e14f0e4e6bd WatchSource:0}: Error finding container 623573f25a99d329942635a7fc7ad6fa9e41f9da59ef43c3f8171e14f0e4e6bd: Status 404 returned error can't find the container with id 623573f25a99d329942635a7fc7ad6fa9e41f9da59ef43c3f8171e14f0e4e6bd Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.788977 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-867c9d769c-ldlxw"] Feb 19 18:37:30 crc kubenswrapper[4749]: I0219 18:37:30.849243 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl"] Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.288116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" event={"ID":"772b35c8-0ee5-4853-846f-1332e07b872e","Type":"ContainerStarted","Data":"9c20484e1b0e5fea2394bfa81792d4b4263e9e9eb7a94d53d761b0d9059ca38a"} Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.288452 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.288466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" event={"ID":"772b35c8-0ee5-4853-846f-1332e07b872e","Type":"ContainerStarted","Data":"efaed43b42b847219cf0df3a9f80d6e9c03a5f8d06dcf2f160d7d451660b716f"} Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.289611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" event={"ID":"15e4dbc2-4810-4dfa-8f96-67a03c653232","Type":"ContainerStarted","Data":"d33eb072e6aa8509ef533a87adcb3c989d6846f22d7889bc122d96da869bc79e"} Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.289636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" event={"ID":"15e4dbc2-4810-4dfa-8f96-67a03c653232","Type":"ContainerStarted","Data":"623573f25a99d329942635a7fc7ad6fa9e41f9da59ef43c3f8171e14f0e4e6bd"} Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.289848 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.293597 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.308595 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" podStartSLOduration=3.308575905 podStartE2EDuration="3.308575905s" podCreationTimestamp="2026-02-19 18:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 18:37:31.307447817 +0000 UTC m=+225.268667771" watchObservedRunningTime="2026-02-19 18:37:31.308575905 +0000 UTC m=+225.269795849" Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.325243 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-867c9d769c-ldlxw" podStartSLOduration=3.325215519 podStartE2EDuration="3.325215519s" podCreationTimestamp="2026-02-19 18:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:37:31.323464986 +0000 UTC m=+225.284684930" watchObservedRunningTime="2026-02-19 18:37:31.325215519 +0000 UTC m=+225.286435473" Feb 19 18:37:31 crc kubenswrapper[4749]: I0219 18:37:31.530326 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.054392 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jr5gz"] Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.055990 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.074167 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jr5gz"] Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.181874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-registry-certificates\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.181921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-bound-sa-token\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.181947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gjrz\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-kube-api-access-8gjrz\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.181976 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.182137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-trusted-ca\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.182230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-registry-tls\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.182249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.182328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.215540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.283777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gjrz\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-kube-api-access-8gjrz\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.283845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.283880 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-trusted-ca\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.283911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.283930 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-registry-tls\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.283959 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-registry-certificates\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.283979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-bound-sa-token\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.284373 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.285455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-trusted-ca\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 
crc kubenswrapper[4749]: I0219 18:37:56.285580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-registry-certificates\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.290753 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.295167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-registry-tls\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.304708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-bound-sa-token\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.304844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gjrz\" (UniqueName: \"kubernetes.io/projected/7d7b12aa-36ed-45e1-ba97-2bdd4eec1090-kube-api-access-8gjrz\") pod \"image-registry-66df7c8f76-jr5gz\" (UID: \"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.420657 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:56 crc kubenswrapper[4749]: I0219 18:37:56.895936 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jr5gz"] Feb 19 18:37:57 crc kubenswrapper[4749]: I0219 18:37:57.576756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" event={"ID":"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090","Type":"ContainerStarted","Data":"2f42b3b7e88bfbcd6fec02492b472a2afc1ae2b91a8f43a0de3d5dc3a7da50d2"} Feb 19 18:37:57 crc kubenswrapper[4749]: I0219 18:37:57.577126 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:37:57 crc kubenswrapper[4749]: I0219 18:37:57.577139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" event={"ID":"7d7b12aa-36ed-45e1-ba97-2bdd4eec1090","Type":"ContainerStarted","Data":"b190b5170a193bd81583318f02b70ed161dfe9f612773bccd72483d97ba6d626"} Feb 19 18:37:57 crc kubenswrapper[4749]: I0219 18:37:57.596516 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" podStartSLOduration=1.596498187 podStartE2EDuration="1.596498187s" podCreationTimestamp="2026-02-19 18:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:37:57.593627935 +0000 UTC m=+251.554847879" watchObservedRunningTime="2026-02-19 18:37:57.596498187 +0000 UTC m=+251.557718131" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.389609 4749 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-2n4lk"] Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.390443 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2n4lk" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="registry-server" containerID="cri-o://bf71ebb1f626a916784e5d96dac0e63da11729d1b2eab51122f2a1fcf658658c" gracePeriod=30 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.396323 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5psfv"] Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.399474 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5psfv" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="registry-server" containerID="cri-o://79c420ff0c280697eff14168d9b1bd532f25df855e04ae0afe8f73f071036838" gracePeriod=30 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.412270 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6hzt4"] Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.412533 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator" containerID="cri-o://3008ef3793866254d0b21a4dc72db01224bae12a5c710639cea59c82e9571367" gracePeriod=30 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.423342 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqrzw"] Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.423613 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zqrzw" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="registry-server" 
containerID="cri-o://57053058ce9dba1ad132b46be3808a9c08bcce8abeab544fe1107d1a539677c0" gracePeriod=30 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.428172 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wd24f"] Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.429121 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.433147 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mw89w"] Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.433581 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mw89w" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="registry-server" containerID="cri-o://c6924bb6bd831cc3ea14dd7bfc20730677ce22775f25736361ed2ef2eba435f4" gracePeriod=30 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.435972 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wd24f"] Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.466629 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-zqrzw" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="registry-server" probeResult="failure" output="" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.493380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 
18:38:10.493443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pzts\" (UniqueName: \"kubernetes.io/projected/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-kube-api-access-2pzts\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.493479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.594655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pzts\" (UniqueName: \"kubernetes.io/projected/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-kube-api-access-2pzts\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.595065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.595181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.596280 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.603794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.614377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pzts\" (UniqueName: \"kubernetes.io/projected/22a5e8d0-f222-4a7c-8bb7-51689ef460a8-kube-api-access-2pzts\") pod \"marketplace-operator-79b997595-wd24f\" (UID: \"22a5e8d0-f222-4a7c-8bb7-51689ef460a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.646504 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerID="79c420ff0c280697eff14168d9b1bd532f25df855e04ae0afe8f73f071036838" exitCode=0 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.646554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5psfv" 
event={"ID":"f9165e83-4c09-4c44-b185-8f8922fcdad7","Type":"ContainerDied","Data":"79c420ff0c280697eff14168d9b1bd532f25df855e04ae0afe8f73f071036838"} Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.650462 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerID="57053058ce9dba1ad132b46be3808a9c08bcce8abeab544fe1107d1a539677c0" exitCode=0 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.650498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqrzw" event={"ID":"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2","Type":"ContainerDied","Data":"57053058ce9dba1ad132b46be3808a9c08bcce8abeab544fe1107d1a539677c0"} Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.652882 4749 generic.go:334] "Generic (PLEG): container finished" podID="be345565-0341-4290-b5e8-9cf728685a6b" containerID="3008ef3793866254d0b21a4dc72db01224bae12a5c710639cea59c82e9571367" exitCode=0 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.652939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" event={"ID":"be345565-0341-4290-b5e8-9cf728685a6b","Type":"ContainerDied","Data":"3008ef3793866254d0b21a4dc72db01224bae12a5c710639cea59c82e9571367"} Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.652961 4749 scope.go:117] "RemoveContainer" containerID="06d5e96737d0f4a47d1dddbd662440c6a106b31f04a61223d71c203bd702097a" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.656862 4749 generic.go:334] "Generic (PLEG): container finished" podID="e19c61ad-b387-457b-814b-e382b0265880" containerID="bf71ebb1f626a916784e5d96dac0e63da11729d1b2eab51122f2a1fcf658658c" exitCode=0 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.656915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n4lk" 
event={"ID":"e19c61ad-b387-457b-814b-e382b0265880","Type":"ContainerDied","Data":"bf71ebb1f626a916784e5d96dac0e63da11729d1b2eab51122f2a1fcf658658c"} Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.660138 4749 generic.go:334] "Generic (PLEG): container finished" podID="69277352-22e8-4094-944f-bb38a3fb3a83" containerID="c6924bb6bd831cc3ea14dd7bfc20730677ce22775f25736361ed2ef2eba435f4" exitCode=0 Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.660158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mw89w" event={"ID":"69277352-22e8-4094-944f-bb38a3fb3a83","Type":"ContainerDied","Data":"c6924bb6bd831cc3ea14dd7bfc20730677ce22775f25736361ed2ef2eba435f4"} Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.754610 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.917352 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5psfv" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.971694 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.976892 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mw89w" Feb 19 18:38:10 crc kubenswrapper[4749]: I0219 18:38:10.984777 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqrzw" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.003990 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf4qf\" (UniqueName: \"kubernetes.io/projected/be345565-0341-4290-b5e8-9cf728685a6b-kube-api-access-hf4qf\") pod \"be345565-0341-4290-b5e8-9cf728685a6b\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-utilities\") pod \"69277352-22e8-4094-944f-bb38a3fb3a83\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bflbh\" (UniqueName: \"kubernetes.io/projected/69277352-22e8-4094-944f-bb38a3fb3a83-kube-api-access-bflbh\") pod \"69277352-22e8-4094-944f-bb38a3fb3a83\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-operator-metrics\") pod \"be345565-0341-4290-b5e8-9cf728685a6b\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-trusted-ca\") pod \"be345565-0341-4290-b5e8-9cf728685a6b\" (UID: \"be345565-0341-4290-b5e8-9cf728685a6b\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004179 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/f9165e83-4c09-4c44-b185-8f8922fcdad7-kube-api-access-7dx7n\") pod \"f9165e83-4c09-4c44-b185-8f8922fcdad7\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004204 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-utilities\") pod \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-kube-api-access-pfdxc\") pod \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-catalog-content\") pod \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\" (UID: \"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004293 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-utilities\") pod \"f9165e83-4c09-4c44-b185-8f8922fcdad7\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004310 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-catalog-content\") pod 
\"f9165e83-4c09-4c44-b185-8f8922fcdad7\" (UID: \"f9165e83-4c09-4c44-b185-8f8922fcdad7\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.004351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-catalog-content\") pod \"69277352-22e8-4094-944f-bb38a3fb3a83\" (UID: \"69277352-22e8-4094-944f-bb38a3fb3a83\") " Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.005699 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-utilities" (OuterVolumeSpecName: "utilities") pod "ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" (UID: "ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.012237 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9165e83-4c09-4c44-b185-8f8922fcdad7-kube-api-access-7dx7n" (OuterVolumeSpecName: "kube-api-access-7dx7n") pod "f9165e83-4c09-4c44-b185-8f8922fcdad7" (UID: "f9165e83-4c09-4c44-b185-8f8922fcdad7"). InnerVolumeSpecName "kube-api-access-7dx7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.013298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-utilities" (OuterVolumeSpecName: "utilities") pod "69277352-22e8-4094-944f-bb38a3fb3a83" (UID: "69277352-22e8-4094-944f-bb38a3fb3a83"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.014734 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-kube-api-access-pfdxc" (OuterVolumeSpecName: "kube-api-access-pfdxc") pod "ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" (UID: "ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2"). InnerVolumeSpecName "kube-api-access-pfdxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.017586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-utilities" (OuterVolumeSpecName: "utilities") pod "f9165e83-4c09-4c44-b185-8f8922fcdad7" (UID: "f9165e83-4c09-4c44-b185-8f8922fcdad7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.019585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "be345565-0341-4290-b5e8-9cf728685a6b" (UID: "be345565-0341-4290-b5e8-9cf728685a6b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.019783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be345565-0341-4290-b5e8-9cf728685a6b-kube-api-access-hf4qf" (OuterVolumeSpecName: "kube-api-access-hf4qf") pod "be345565-0341-4290-b5e8-9cf728685a6b" (UID: "be345565-0341-4290-b5e8-9cf728685a6b"). InnerVolumeSpecName "kube-api-access-hf4qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.019918 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "be345565-0341-4290-b5e8-9cf728685a6b" (UID: "be345565-0341-4290-b5e8-9cf728685a6b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.023147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69277352-22e8-4094-944f-bb38a3fb3a83-kube-api-access-bflbh" (OuterVolumeSpecName: "kube-api-access-bflbh") pod "69277352-22e8-4094-944f-bb38a3fb3a83" (UID: "69277352-22e8-4094-944f-bb38a3fb3a83"). InnerVolumeSpecName "kube-api-access-bflbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.026073 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n4lk" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.042546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" (UID: "ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.086473 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9165e83-4c09-4c44-b185-8f8922fcdad7" (UID: "f9165e83-4c09-4c44-b185-8f8922fcdad7"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.104771 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-utilities\") pod \"e19c61ad-b387-457b-814b-e382b0265880\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") "
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.104824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-catalog-content\") pod \"e19c61ad-b387-457b-814b-e382b0265880\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") "
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.104854 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmq7w\" (UniqueName: \"kubernetes.io/projected/e19c61ad-b387-457b-814b-e382b0265880-kube-api-access-gmq7w\") pod \"e19c61ad-b387-457b-814b-e382b0265880\" (UID: \"e19c61ad-b387-457b-814b-e382b0265880\") "
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105143 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105165 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be345565-0341-4290-b5e8-9cf728685a6b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105179 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/f9165e83-4c09-4c44-b185-8f8922fcdad7-kube-api-access-7dx7n\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105192 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105203 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfdxc\" (UniqueName: \"kubernetes.io/projected/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-kube-api-access-pfdxc\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105214 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105225 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105236 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9165e83-4c09-4c44-b185-8f8922fcdad7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105246 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf4qf\" (UniqueName: \"kubernetes.io/projected/be345565-0341-4290-b5e8-9cf728685a6b-kube-api-access-hf4qf\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105257 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.105268 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bflbh\" (UniqueName: \"kubernetes.io/projected/69277352-22e8-4094-944f-bb38a3fb3a83-kube-api-access-bflbh\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.106231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-utilities" (OuterVolumeSpecName: "utilities") pod "e19c61ad-b387-457b-814b-e382b0265880" (UID: "e19c61ad-b387-457b-814b-e382b0265880"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.107852 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19c61ad-b387-457b-814b-e382b0265880-kube-api-access-gmq7w" (OuterVolumeSpecName: "kube-api-access-gmq7w") pod "e19c61ad-b387-457b-814b-e382b0265880" (UID: "e19c61ad-b387-457b-814b-e382b0265880"). InnerVolumeSpecName "kube-api-access-gmq7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.165457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e19c61ad-b387-457b-814b-e382b0265880" (UID: "e19c61ad-b387-457b-814b-e382b0265880"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.182237 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69277352-22e8-4094-944f-bb38a3fb3a83" (UID: "69277352-22e8-4094-944f-bb38a3fb3a83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.205860 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.205886 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19c61ad-b387-457b-814b-e382b0265880-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.205899 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmq7w\" (UniqueName: \"kubernetes.io/projected/e19c61ad-b387-457b-814b-e382b0265880-kube-api-access-gmq7w\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.205909 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69277352-22e8-4094-944f-bb38a3fb3a83-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.351168 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wd24f"]
Feb 19 18:38:11 crc kubenswrapper[4749]: W0219 18:38:11.357422 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a5e8d0_f222_4a7c_8bb7_51689ef460a8.slice/crio-c942f6493be2a82ce332ae380fa61b7bbf55059b8356e34179366c99dc953ff6 WatchSource:0}: Error finding container c942f6493be2a82ce332ae380fa61b7bbf55059b8356e34179366c99dc953ff6: Status 404 returned error can't find the container with id c942f6493be2a82ce332ae380fa61b7bbf55059b8356e34179366c99dc953ff6
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.666394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5psfv" event={"ID":"f9165e83-4c09-4c44-b185-8f8922fcdad7","Type":"ContainerDied","Data":"0e3ff519c4aeb91a96fa60a1753584bc8bc911794cc34b45677ef82a49543a56"}
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.666408 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5psfv"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.666454 4749 scope.go:117] "RemoveContainer" containerID="79c420ff0c280697eff14168d9b1bd532f25df855e04ae0afe8f73f071036838"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.669579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqrzw" event={"ID":"ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2","Type":"ContainerDied","Data":"bf2a63de371b3f6e96e35c39eec2619090881a550d87edd1d374d5570b71dd4b"}
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.669659 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqrzw"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.673506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n4lk" event={"ID":"e19c61ad-b387-457b-814b-e382b0265880","Type":"ContainerDied","Data":"7d9a9e717d9257a3572dbf5b955b8ff29d67f86c7d002689e7f3edc18a3b0060"}
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.673535 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n4lk"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.675931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mw89w" event={"ID":"69277352-22e8-4094-944f-bb38a3fb3a83","Type":"ContainerDied","Data":"c6064e4f35d132ef526cf97b5d5f70a023cae975be9417d797fce56386080b2c"}
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.676157 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mw89w"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.678925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" event={"ID":"22a5e8d0-f222-4a7c-8bb7-51689ef460a8","Type":"ContainerStarted","Data":"6d54c10c02874b39dad5324edf71309822258f69237bdd489f08e22134014b54"}
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.678956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" event={"ID":"22a5e8d0-f222-4a7c-8bb7-51689ef460a8","Type":"ContainerStarted","Data":"c942f6493be2a82ce332ae380fa61b7bbf55059b8356e34179366c99dc953ff6"}
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.679525 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.680518 4749 scope.go:117] "RemoveContainer" containerID="a1c99cc01e261fa29c40c85fa0b7f1d6a0c233f77180bb31566d2edadb41d805"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.682496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4" event={"ID":"be345565-0341-4290-b5e8-9cf728685a6b","Type":"ContainerDied","Data":"a59c55537f38b7418549628cd33f7ee6f1115c9173cef2ec8b7941ccd270acaf"}
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.682648 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6hzt4"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.682659 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wd24f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body=
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.683094 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" podUID="22a5e8d0-f222-4a7c-8bb7-51689ef460a8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.724900 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f" podStartSLOduration=1.7248845670000001 podStartE2EDuration="1.724884567s" podCreationTimestamp="2026-02-19 18:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:38:11.722318095 +0000 UTC m=+265.683538069" watchObservedRunningTime="2026-02-19 18:38:11.724884567 +0000 UTC m=+265.686104521"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.727293 4749 scope.go:117] "RemoveContainer" containerID="618acf69a1453aa4006f93fdac71b7f649fbb6268f86664da9c8c53a8fc772a5"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.754249 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqrzw"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.759910 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqrzw"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.765393 4749 scope.go:117] "RemoveContainer" containerID="57053058ce9dba1ad132b46be3808a9c08bcce8abeab544fe1107d1a539677c0"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.766139 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mw89w"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.773810 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mw89w"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.779582 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2n4lk"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.786148 4749 scope.go:117] "RemoveContainer" containerID="1627e72b097131221e09b8f0a5379b7bd8d53f17bf4a97521131a0f62bd155f8"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.787405 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2n4lk"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.802139 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6hzt4"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.802205 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6hzt4"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.802842 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5psfv"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.802917 4749 scope.go:117] "RemoveContainer" containerID="d740085761000fb6a8cb0ebd4843bc319d78a11d5b161cd9188174f09be3c54c"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.805424 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5psfv"]
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.825550 4749 scope.go:117] "RemoveContainer" containerID="bf71ebb1f626a916784e5d96dac0e63da11729d1b2eab51122f2a1fcf658658c"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.840577 4749 scope.go:117] "RemoveContainer" containerID="021ab6d95986618edbe6ae1d8d8362f6b09613200000a3026e94528ab4e46f3d"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.853853 4749 scope.go:117] "RemoveContainer" containerID="e707574541e931655daa9470cbd350cc04c2cdb19a5980a945380d7f511b0a37"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.867570 4749 scope.go:117] "RemoveContainer" containerID="c6924bb6bd831cc3ea14dd7bfc20730677ce22775f25736361ed2ef2eba435f4"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.879722 4749 scope.go:117] "RemoveContainer" containerID="306f0f4cfccfc898010693b446028e5667f6ec4dba2ed545165d44aff25071e8"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.900363 4749 scope.go:117] "RemoveContainer" containerID="5620579ece5c2e07ef2e9c99aa6e64eb16c69fac1b649920764774b970dda932"
Feb 19 18:38:11 crc kubenswrapper[4749]: I0219 18:38:11.912140 4749 scope.go:117] "RemoveContainer" containerID="3008ef3793866254d0b21a4dc72db01224bae12a5c710639cea59c82e9571367"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402254 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-chgqt"]
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402593 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402631 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402649 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402666 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402694 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402711 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402738 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402754 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402776 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402793 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402811 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402827 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402849 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402865 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402887 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402901 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402926 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402942 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.402967 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.402982 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.403007 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403058 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="extract-content"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.403079 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403095 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.403124 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403139 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="extract-utilities"
Feb 19 18:38:12 crc kubenswrapper[4749]: E0219 18:38:12.403164 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403180 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403357 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403382 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403398 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="be345565-0341-4290-b5e8-9cf728685a6b" containerName="marketplace-operator"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403416 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19c61ad-b387-457b-814b-e382b0265880" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403431 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.403447 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" containerName="registry-server"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.405086 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.410203 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chgqt"]
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.414862 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.422864 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a20f87-31e9-444d-a98f-588258a67d7d-catalog-content\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.423269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgkb7\" (UniqueName: \"kubernetes.io/projected/00a20f87-31e9-444d-a98f-588258a67d7d-kube-api-access-lgkb7\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.423472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a20f87-31e9-444d-a98f-588258a67d7d-utilities\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.524858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a20f87-31e9-444d-a98f-588258a67d7d-catalog-content\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.525115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgkb7\" (UniqueName: \"kubernetes.io/projected/00a20f87-31e9-444d-a98f-588258a67d7d-kube-api-access-lgkb7\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.525170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a20f87-31e9-444d-a98f-588258a67d7d-utilities\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.525667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a20f87-31e9-444d-a98f-588258a67d7d-utilities\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.525845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a20f87-31e9-444d-a98f-588258a67d7d-catalog-content\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.542912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgkb7\" (UniqueName: \"kubernetes.io/projected/00a20f87-31e9-444d-a98f-588258a67d7d-kube-api-access-lgkb7\") pod \"certified-operators-chgqt\" (UID: \"00a20f87-31e9-444d-a98f-588258a67d7d\") " pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.689874 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69277352-22e8-4094-944f-bb38a3fb3a83" path="/var/lib/kubelet/pods/69277352-22e8-4094-944f-bb38a3fb3a83/volumes"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.691673 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be345565-0341-4290-b5e8-9cf728685a6b" path="/var/lib/kubelet/pods/be345565-0341-4290-b5e8-9cf728685a6b/volumes"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.693180 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2" path="/var/lib/kubelet/pods/ce02a175-8cf2-4b11-b1b6-a3e0eb2fe4b2/volumes"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.694459 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19c61ad-b387-457b-814b-e382b0265880" path="/var/lib/kubelet/pods/e19c61ad-b387-457b-814b-e382b0265880/volumes"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.695061 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9165e83-4c09-4c44-b185-8f8922fcdad7" path="/var/lib/kubelet/pods/f9165e83-4c09-4c44-b185-8f8922fcdad7/volumes"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.696864 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wd24f"
Feb 19 18:38:12 crc kubenswrapper[4749]: I0219 18:38:12.731120 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-chgqt"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.000705 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rt4s8"]
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.008521 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rt4s8"]
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.008625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.010442 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.031597 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52v5x\" (UniqueName: \"kubernetes.io/projected/73ceef31-959a-49ce-9f27-0b41330d330b-kube-api-access-52v5x\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.031709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ceef31-959a-49ce-9f27-0b41330d330b-catalog-content\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.031760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73ceef31-959a-49ce-9f27-0b41330d330b-utilities\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.100991 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chgqt"]
Feb 19 18:38:13 crc kubenswrapper[4749]: W0219 18:38:13.107793 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a20f87_31e9_444d_a98f_588258a67d7d.slice/crio-6a42e9e804e126e4c07d1ca9b1a9068da3c90c67758cc30e731d092e62768466 WatchSource:0}: Error finding container 6a42e9e804e126e4c07d1ca9b1a9068da3c90c67758cc30e731d092e62768466: Status 404 returned error can't find the container with id 6a42e9e804e126e4c07d1ca9b1a9068da3c90c67758cc30e731d092e62768466
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.136892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52v5x\" (UniqueName: \"kubernetes.io/projected/73ceef31-959a-49ce-9f27-0b41330d330b-kube-api-access-52v5x\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.136988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ceef31-959a-49ce-9f27-0b41330d330b-catalog-content\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.137065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73ceef31-959a-49ce-9f27-0b41330d330b-utilities\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.137743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73ceef31-959a-49ce-9f27-0b41330d330b-utilities\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.137760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ceef31-959a-49ce-9f27-0b41330d330b-catalog-content\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.155863 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52v5x\" (UniqueName: \"kubernetes.io/projected/73ceef31-959a-49ce-9f27-0b41330d330b-kube-api-access-52v5x\") pod \"redhat-marketplace-rt4s8\" (UID: \"73ceef31-959a-49ce-9f27-0b41330d330b\") " pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.328156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rt4s8"
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.699407 4749 generic.go:334] "Generic (PLEG): container finished" podID="00a20f87-31e9-444d-a98f-588258a67d7d" containerID="e4afd21982234451b14802fa967944e95a58c976991a21ab96a7f508dac90437" exitCode=0
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.699460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chgqt" event={"ID":"00a20f87-31e9-444d-a98f-588258a67d7d","Type":"ContainerDied","Data":"e4afd21982234451b14802fa967944e95a58c976991a21ab96a7f508dac90437"}
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.699508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chgqt" event={"ID":"00a20f87-31e9-444d-a98f-588258a67d7d","Type":"ContainerStarted","Data":"6a42e9e804e126e4c07d1ca9b1a9068da3c90c67758cc30e731d092e62768466"}
Feb 19 18:38:13 crc kubenswrapper[4749]: I0219 18:38:13.709223 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rt4s8"]
Feb 19 18:38:13 crc kubenswrapper[4749]: W0219 18:38:13.723403 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ceef31_959a_49ce_9f27_0b41330d330b.slice/crio-f30e5f51510aadfcf8de5cc06cd18f40f90af49097b10917c1fe59ade5876813 WatchSource:0}: Error finding container f30e5f51510aadfcf8de5cc06cd18f40f90af49097b10917c1fe59ade5876813: Status 404 returned error can't find the container with id f30e5f51510aadfcf8de5cc06cd18f40f90af49097b10917c1fe59ade5876813
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.706784 4749 generic.go:334] "Generic (PLEG): container finished" podID="73ceef31-959a-49ce-9f27-0b41330d330b" containerID="1526eaf753934025c4ca0f9bec4e28ec5866f303d8a3ee2a0c3373de48d69a82" exitCode=0
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.707148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt4s8" event={"ID":"73ceef31-959a-49ce-9f27-0b41330d330b","Type":"ContainerDied","Data":"1526eaf753934025c4ca0f9bec4e28ec5866f303d8a3ee2a0c3373de48d69a82"}
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.707200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt4s8" event={"ID":"73ceef31-959a-49ce-9f27-0b41330d330b","Type":"ContainerStarted","Data":"f30e5f51510aadfcf8de5cc06cd18f40f90af49097b10917c1fe59ade5876813"}
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.815396 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d8w5c"]
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.817433 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d8w5c"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.818172 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d8w5c"]
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.849334 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.865764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a852ed-343b-45a6-987c-d6a4a98446dd-catalog-content\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.865943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a852ed-343b-45a6-987c-d6a4a98446dd-utilities\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.866071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szxr2\" (UniqueName: \"kubernetes.io/projected/50a852ed-343b-45a6-987c-d6a4a98446dd-kube-api-access-szxr2\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.967018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szxr2\" (UniqueName: \"kubernetes.io/projected/50a852ed-343b-45a6-987c-d6a4a98446dd-kube-api-access-szxr2\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.967119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a852ed-343b-45a6-987c-d6a4a98446dd-catalog-content\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.967171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a852ed-343b-45a6-987c-d6a4a98446dd-utilities\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c"
Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.967719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a852ed-343b-45a6-987c-d6a4a98446dd-catalog-content\") pod
\"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.967722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a852ed-343b-45a6-987c-d6a4a98446dd-utilities\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:14 crc kubenswrapper[4749]: I0219 18:38:14.988759 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szxr2\" (UniqueName: \"kubernetes.io/projected/50a852ed-343b-45a6-987c-d6a4a98446dd-kube-api-access-szxr2\") pod \"redhat-operators-d8w5c\" (UID: \"50a852ed-343b-45a6-987c-d6a4a98446dd\") " pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.193992 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.406361 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g6rth"] Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.407681 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.409664 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.411545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6rth"] Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.472697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-catalog-content\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.472737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk456\" (UniqueName: \"kubernetes.io/projected/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-kube-api-access-zk456\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.472766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-utilities\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.552287 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d8w5c"] Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.579623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-catalog-content\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.580248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk456\" (UniqueName: \"kubernetes.io/projected/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-kube-api-access-zk456\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.580292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-utilities\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.580362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-catalog-content\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.580645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-utilities\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.598451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk456\" (UniqueName: 
\"kubernetes.io/projected/ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9-kube-api-access-zk456\") pod \"community-operators-g6rth\" (UID: \"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9\") " pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.712801 4749 generic.go:334] "Generic (PLEG): container finished" podID="00a20f87-31e9-444d-a98f-588258a67d7d" containerID="645bf79792a478e63f3ead21b3951333d5d9558da8b3c73f06627b50eb63dd05" exitCode=0 Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.713772 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chgqt" event={"ID":"00a20f87-31e9-444d-a98f-588258a67d7d","Type":"ContainerDied","Data":"645bf79792a478e63f3ead21b3951333d5d9558da8b3c73f06627b50eb63dd05"} Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.718118 4749 generic.go:334] "Generic (PLEG): container finished" podID="50a852ed-343b-45a6-987c-d6a4a98446dd" containerID="af81772975a4f77316461a4327503a7330b60f1e53cfc48f5eac15dc63e5185a" exitCode=0 Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.718223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8w5c" event={"ID":"50a852ed-343b-45a6-987c-d6a4a98446dd","Type":"ContainerDied","Data":"af81772975a4f77316461a4327503a7330b60f1e53cfc48f5eac15dc63e5185a"} Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.718409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8w5c" event={"ID":"50a852ed-343b-45a6-987c-d6a4a98446dd","Type":"ContainerStarted","Data":"dc24004a45dcc977ab480bb55f40d129f1838c114420a5d12b55973c3df8be41"} Feb 19 18:38:15 crc kubenswrapper[4749]: I0219 18:38:15.740931 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.248398 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6rth"] Feb 19 18:38:16 crc kubenswrapper[4749]: W0219 18:38:16.256223 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffb2ea5d_fbe3_41d3_8dde_e7c7f31722a9.slice/crio-07d672e40a4920af48670180445c00e9b2c2c066b4567570b156d89e230d2d16 WatchSource:0}: Error finding container 07d672e40a4920af48670180445c00e9b2c2c066b4567570b156d89e230d2d16: Status 404 returned error can't find the container with id 07d672e40a4920af48670180445c00e9b2c2c066b4567570b156d89e230d2d16 Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.425879 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jr5gz" Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.474241 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w4nkj"] Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.727466 4749 generic.go:334] "Generic (PLEG): container finished" podID="73ceef31-959a-49ce-9f27-0b41330d330b" containerID="05fa42d2dbf58c7eced1a8044392be3944edf79352ef0f81a9350e349d1c4371" exitCode=0 Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.727516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt4s8" event={"ID":"73ceef31-959a-49ce-9f27-0b41330d330b","Type":"ContainerDied","Data":"05fa42d2dbf58c7eced1a8044392be3944edf79352ef0f81a9350e349d1c4371"} Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.732349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chgqt" 
event={"ID":"00a20f87-31e9-444d-a98f-588258a67d7d","Type":"ContainerStarted","Data":"48313f26e3a02ff577c96ffdfc7a5f7f51aef1c498eedef06d9ce95f59942c60"} Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.734709 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9" containerID="1473e8ddd62ce11ff40df0072a829bbb16ff62a1ce71959df387e3594e018808" exitCode=0 Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.734787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6rth" event={"ID":"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9","Type":"ContainerDied","Data":"1473e8ddd62ce11ff40df0072a829bbb16ff62a1ce71959df387e3594e018808"} Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.734810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6rth" event={"ID":"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9","Type":"ContainerStarted","Data":"07d672e40a4920af48670180445c00e9b2c2c066b4567570b156d89e230d2d16"} Feb 19 18:38:16 crc kubenswrapper[4749]: I0219 18:38:16.768494 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-chgqt" podStartSLOduration=2.229661015 podStartE2EDuration="4.768474471s" podCreationTimestamp="2026-02-19 18:38:12 +0000 UTC" firstStartedPulling="2026-02-19 18:38:13.702287408 +0000 UTC m=+267.663507362" lastFinishedPulling="2026-02-19 18:38:16.241100864 +0000 UTC m=+270.202320818" observedRunningTime="2026-02-19 18:38:16.765902419 +0000 UTC m=+270.727122373" watchObservedRunningTime="2026-02-19 18:38:16.768474471 +0000 UTC m=+270.729694425" Feb 19 18:38:18 crc kubenswrapper[4749]: I0219 18:38:18.750834 4749 generic.go:334] "Generic (PLEG): container finished" podID="50a852ed-343b-45a6-987c-d6a4a98446dd" containerID="9dfa26cee3070d2b4d18ce73b5ae7747bc813ed9c5edb2cb1e08fefdab718a0b" exitCode=0 Feb 19 18:38:18 crc kubenswrapper[4749]: I0219 
18:38:18.750917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8w5c" event={"ID":"50a852ed-343b-45a6-987c-d6a4a98446dd","Type":"ContainerDied","Data":"9dfa26cee3070d2b4d18ce73b5ae7747bc813ed9c5edb2cb1e08fefdab718a0b"} Feb 19 18:38:18 crc kubenswrapper[4749]: I0219 18:38:18.753782 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rt4s8" event={"ID":"73ceef31-959a-49ce-9f27-0b41330d330b","Type":"ContainerStarted","Data":"72b1e5010b94ced0006bc28096279022fb8d550b0d4466f000685a2255916ad7"} Feb 19 18:38:18 crc kubenswrapper[4749]: I0219 18:38:18.763860 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9" containerID="a023f2be3689921e7388477a1015f3f24a84d5d38ba04e93cfb4718f67ad33f9" exitCode=0 Feb 19 18:38:18 crc kubenswrapper[4749]: I0219 18:38:18.763903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6rth" event={"ID":"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9","Type":"ContainerDied","Data":"a023f2be3689921e7388477a1015f3f24a84d5d38ba04e93cfb4718f67ad33f9"} Feb 19 18:38:18 crc kubenswrapper[4749]: I0219 18:38:18.793513 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rt4s8" podStartSLOduration=4.001959705 podStartE2EDuration="6.793490397s" podCreationTimestamp="2026-02-19 18:38:12 +0000 UTC" firstStartedPulling="2026-02-19 18:38:14.77655598 +0000 UTC m=+268.737775944" lastFinishedPulling="2026-02-19 18:38:17.568086682 +0000 UTC m=+271.529306636" observedRunningTime="2026-02-19 18:38:18.789986814 +0000 UTC m=+272.751206788" watchObservedRunningTime="2026-02-19 18:38:18.793490397 +0000 UTC m=+272.754710361" Feb 19 18:38:19 crc kubenswrapper[4749]: I0219 18:38:19.771976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6rth" 
event={"ID":"ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9","Type":"ContainerStarted","Data":"12779e901e9600c2c7610c87f4290634214ae19608f8708dd4587184664328d5"} Feb 19 18:38:19 crc kubenswrapper[4749]: I0219 18:38:19.774420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8w5c" event={"ID":"50a852ed-343b-45a6-987c-d6a4a98446dd","Type":"ContainerStarted","Data":"0de06afab78797e3eb8b96fa1f757876c334dfb30f73cab32369cdeaf420d4ed"} Feb 19 18:38:19 crc kubenswrapper[4749]: I0219 18:38:19.845405 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6rth" podStartSLOduration=2.346474815 podStartE2EDuration="4.845387093s" podCreationTimestamp="2026-02-19 18:38:15 +0000 UTC" firstStartedPulling="2026-02-19 18:38:16.736160595 +0000 UTC m=+270.697380549" lastFinishedPulling="2026-02-19 18:38:19.235072883 +0000 UTC m=+273.196292827" observedRunningTime="2026-02-19 18:38:19.804965201 +0000 UTC m=+273.766185155" watchObservedRunningTime="2026-02-19 18:38:19.845387093 +0000 UTC m=+273.806607047" Feb 19 18:38:19 crc kubenswrapper[4749]: I0219 18:38:19.848409 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d8w5c" podStartSLOduration=3.209909601 podStartE2EDuration="5.848401165s" podCreationTimestamp="2026-02-19 18:38:14 +0000 UTC" firstStartedPulling="2026-02-19 18:38:16.736273487 +0000 UTC m=+270.697493441" lastFinishedPulling="2026-02-19 18:38:19.374765011 +0000 UTC m=+273.335985005" observedRunningTime="2026-02-19 18:38:19.843451106 +0000 UTC m=+273.804671060" watchObservedRunningTime="2026-02-19 18:38:19.848401165 +0000 UTC m=+273.809621119" Feb 19 18:38:22 crc kubenswrapper[4749]: I0219 18:38:22.731343 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-chgqt" Feb 19 18:38:22 crc kubenswrapper[4749]: I0219 18:38:22.731994 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-chgqt" Feb 19 18:38:22 crc kubenswrapper[4749]: I0219 18:38:22.771233 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-chgqt" Feb 19 18:38:22 crc kubenswrapper[4749]: I0219 18:38:22.830565 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-chgqt" Feb 19 18:38:23 crc kubenswrapper[4749]: I0219 18:38:23.328291 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rt4s8" Feb 19 18:38:23 crc kubenswrapper[4749]: I0219 18:38:23.328353 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rt4s8" Feb 19 18:38:23 crc kubenswrapper[4749]: I0219 18:38:23.369759 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rt4s8" Feb 19 18:38:23 crc kubenswrapper[4749]: I0219 18:38:23.847312 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rt4s8" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.194799 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.194845 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.232610 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.742619 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.743090 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.782353 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.842177 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6rth" Feb 19 18:38:25 crc kubenswrapper[4749]: I0219 18:38:25.852550 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d8w5c" Feb 19 18:38:41 crc kubenswrapper[4749]: I0219 18:38:41.509833 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" podUID="ac1677b6-8344-44c7-a5fc-2924da30ddbc" containerName="registry" containerID="cri-o://8a953d8d4494f7a77d0414437504a1f253340983b72af14119c3b528e38ad8ec" gracePeriod=30 Feb 19 18:38:41 crc kubenswrapper[4749]: I0219 18:38:41.887314 4749 generic.go:334] "Generic (PLEG): container finished" podID="ac1677b6-8344-44c7-a5fc-2924da30ddbc" containerID="8a953d8d4494f7a77d0414437504a1f253340983b72af14119c3b528e38ad8ec" exitCode=0 Feb 19 18:38:41 crc kubenswrapper[4749]: I0219 18:38:41.887470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" event={"ID":"ac1677b6-8344-44c7-a5fc-2924da30ddbc","Type":"ContainerDied","Data":"8a953d8d4494f7a77d0414437504a1f253340983b72af14119c3b528e38ad8ec"} Feb 19 18:38:41 crc kubenswrapper[4749]: I0219 18:38:41.887766 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" 
event={"ID":"ac1677b6-8344-44c7-a5fc-2924da30ddbc","Type":"ContainerDied","Data":"547682b9d6cb014df5f439a576c469d484e31e3ec2c85b06f7e26882530b28d2"} Feb 19 18:38:41 crc kubenswrapper[4749]: I0219 18:38:41.887796 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="547682b9d6cb014df5f439a576c469d484e31e3ec2c85b06f7e26882530b28d2" Feb 19 18:38:41 crc kubenswrapper[4749]: I0219 18:38:41.903229 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-certificates\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-tls\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjrh8\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-kube-api-access-gjrh8\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: 
\"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac1677b6-8344-44c7-a5fc-2924da30ddbc-installation-pull-secrets\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-trusted-ca\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac1677b6-8344-44c7-a5fc-2924da30ddbc-ca-trust-extracted\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.035857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-bound-sa-token\") pod \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\" (UID: \"ac1677b6-8344-44c7-a5fc-2924da30ddbc\") " Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.036656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.041344 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.041830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-kube-api-access-gjrh8" (OuterVolumeSpecName: "kube-api-access-gjrh8") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "kube-api-access-gjrh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.041853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.043332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1677b6-8344-44c7-a5fc-2924da30ddbc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.044114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.051967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1677b6-8344-44c7-a5fc-2924da30ddbc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.053456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ac1677b6-8344-44c7-a5fc-2924da30ddbc" (UID: "ac1677b6-8344-44c7-a5fc-2924da30ddbc"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.138214 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.138251 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.138262 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjrh8\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-kube-api-access-gjrh8\") on node \"crc\" DevicePath \"\"" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.138271 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac1677b6-8344-44c7-a5fc-2924da30ddbc-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.138279 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac1677b6-8344-44c7-a5fc-2924da30ddbc-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.138288 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac1677b6-8344-44c7-a5fc-2924da30ddbc-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.138297 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac1677b6-8344-44c7-a5fc-2924da30ddbc-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 18:38:42 crc 
kubenswrapper[4749]: I0219 18:38:42.894261 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w4nkj" Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.914196 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w4nkj"] Feb 19 18:38:42 crc kubenswrapper[4749]: I0219 18:38:42.917830 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w4nkj"] Feb 19 18:38:44 crc kubenswrapper[4749]: I0219 18:38:44.688402 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1677b6-8344-44c7-a5fc-2924da30ddbc" path="/var/lib/kubelet/pods/ac1677b6-8344-44c7-a5fc-2924da30ddbc/volumes" Feb 19 18:38:46 crc kubenswrapper[4749]: I0219 18:38:46.510766 4749 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 18:39:54 crc kubenswrapper[4749]: I0219 18:39:54.725347 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:39:54 crc kubenswrapper[4749]: I0219 18:39:54.726113 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:40:24 crc kubenswrapper[4749]: I0219 18:40:24.725107 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:40:24 crc kubenswrapper[4749]: I0219 18:40:24.725761 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:40:47 crc kubenswrapper[4749]: I0219 18:40:47.049208 4749 scope.go:117] "RemoveContainer" containerID="dffabbfdfa3d38499c2adf033d395e3afebdffff6631e5b3abba38ea6d2be20b" Feb 19 18:40:47 crc kubenswrapper[4749]: I0219 18:40:47.074234 4749 scope.go:117] "RemoveContainer" containerID="8a953d8d4494f7a77d0414437504a1f253340983b72af14119c3b528e38ad8ec" Feb 19 18:40:54 crc kubenswrapper[4749]: I0219 18:40:54.725550 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:40:54 crc kubenswrapper[4749]: I0219 18:40:54.726193 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:40:54 crc kubenswrapper[4749]: I0219 18:40:54.726259 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:40:54 crc kubenswrapper[4749]: I0219 18:40:54.727017 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e950b1a12f156ed5917a35268977b9c8856e348f780776f4a9cea21c27147df8"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 18:40:54 crc kubenswrapper[4749]: I0219 18:40:54.727141 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://e950b1a12f156ed5917a35268977b9c8856e348f780776f4a9cea21c27147df8" gracePeriod=600 Feb 19 18:40:55 crc kubenswrapper[4749]: I0219 18:40:55.663563 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="e950b1a12f156ed5917a35268977b9c8856e348f780776f4a9cea21c27147df8" exitCode=0 Feb 19 18:40:55 crc kubenswrapper[4749]: I0219 18:40:55.663706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"e950b1a12f156ed5917a35268977b9c8856e348f780776f4a9cea21c27147df8"} Feb 19 18:40:55 crc kubenswrapper[4749]: I0219 18:40:55.664261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"834fb3e32a872fa725853bfb5119dcd730968979e3bcf1f345f3d75fe740a490"} Feb 19 18:40:55 crc kubenswrapper[4749]: I0219 18:40:55.664284 4749 scope.go:117] "RemoveContainer" containerID="d123252182a47ccd6ba31cca4bf5a56c186613b03b6954d68bf4e641e17d68fd" Feb 19 18:43:24 crc kubenswrapper[4749]: I0219 18:43:24.725837 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:43:24 crc kubenswrapper[4749]: I0219 18:43:24.726483 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:43:54 crc kubenswrapper[4749]: I0219 18:43:54.727604 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:43:54 crc kubenswrapper[4749]: I0219 18:43:54.728324 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.879500 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v"] Feb 19 18:43:55 crc kubenswrapper[4749]: E0219 18:43:55.880257 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1677b6-8344-44c7-a5fc-2924da30ddbc" containerName="registry" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.880281 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1677b6-8344-44c7-a5fc-2924da30ddbc" containerName="registry" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.880462 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1677b6-8344-44c7-a5fc-2924da30ddbc" 
containerName="registry" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.881109 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.883551 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.883971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-n755b" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.884204 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.888110 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7ngvd"] Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.889213 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7ngvd" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.890962 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bxhj8" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.893939 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7ngvd"] Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.902355 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v"] Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.905920 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gqvc7"] Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.906553 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.908897 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-hmfwc" Feb 19 18:43:55 crc kubenswrapper[4749]: I0219 18:43:55.915971 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gqvc7"] Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.016728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfgk\" (UniqueName: \"kubernetes.io/projected/b4703030-f4cb-4751-a9e1-5a6c1c9f4332-kube-api-access-qgfgk\") pod \"cert-manager-cainjector-cf98fcc89-pbj6v\" (UID: \"b4703030-f4cb-4751-a9e1-5a6c1c9f4332\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.016812 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dsn\" (UniqueName: \"kubernetes.io/projected/30e026fc-9274-4942-bf3d-68740957aeec-kube-api-access-k4dsn\") pod \"cert-manager-858654f9db-7ngvd\" (UID: \"30e026fc-9274-4942-bf3d-68740957aeec\") " pod="cert-manager/cert-manager-858654f9db-7ngvd" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.016876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdqt5\" (UniqueName: \"kubernetes.io/projected/fc78af5c-d237-4523-8035-d8992d4b539c-kube-api-access-gdqt5\") pod \"cert-manager-webhook-687f57d79b-gqvc7\" (UID: \"fc78af5c-d237-4523-8035-d8992d4b539c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.117949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfgk\" (UniqueName: 
\"kubernetes.io/projected/b4703030-f4cb-4751-a9e1-5a6c1c9f4332-kube-api-access-qgfgk\") pod \"cert-manager-cainjector-cf98fcc89-pbj6v\" (UID: \"b4703030-f4cb-4751-a9e1-5a6c1c9f4332\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.118011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dsn\" (UniqueName: \"kubernetes.io/projected/30e026fc-9274-4942-bf3d-68740957aeec-kube-api-access-k4dsn\") pod \"cert-manager-858654f9db-7ngvd\" (UID: \"30e026fc-9274-4942-bf3d-68740957aeec\") " pod="cert-manager/cert-manager-858654f9db-7ngvd" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.118089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdqt5\" (UniqueName: \"kubernetes.io/projected/fc78af5c-d237-4523-8035-d8992d4b539c-kube-api-access-gdqt5\") pod \"cert-manager-webhook-687f57d79b-gqvc7\" (UID: \"fc78af5c-d237-4523-8035-d8992d4b539c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.136858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdqt5\" (UniqueName: \"kubernetes.io/projected/fc78af5c-d237-4523-8035-d8992d4b539c-kube-api-access-gdqt5\") pod \"cert-manager-webhook-687f57d79b-gqvc7\" (UID: \"fc78af5c-d237-4523-8035-d8992d4b539c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.136891 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dsn\" (UniqueName: \"kubernetes.io/projected/30e026fc-9274-4942-bf3d-68740957aeec-kube-api-access-k4dsn\") pod \"cert-manager-858654f9db-7ngvd\" (UID: \"30e026fc-9274-4942-bf3d-68740957aeec\") " pod="cert-manager/cert-manager-858654f9db-7ngvd" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.137127 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qgfgk\" (UniqueName: \"kubernetes.io/projected/b4703030-f4cb-4751-a9e1-5a6c1c9f4332-kube-api-access-qgfgk\") pod \"cert-manager-cainjector-cf98fcc89-pbj6v\" (UID: \"b4703030-f4cb-4751-a9e1-5a6c1c9f4332\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.201848 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.208750 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7ngvd" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.221264 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.487528 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gqvc7"] Feb 19 18:43:56 crc kubenswrapper[4749]: W0219 18:43:56.499682 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc78af5c_d237_4523_8035_d8992d4b539c.slice/crio-781c84f2a0b47809db3d04cb1412d778ad2940994171454df1035ee9acd49988 WatchSource:0}: Error finding container 781c84f2a0b47809db3d04cb1412d778ad2940994171454df1035ee9acd49988: Status 404 returned error can't find the container with id 781c84f2a0b47809db3d04cb1412d778ad2940994171454df1035ee9acd49988 Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.502310 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.643250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v"] Feb 19 18:43:56 crc 
kubenswrapper[4749]: I0219 18:43:56.647849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7ngvd"] Feb 19 18:43:56 crc kubenswrapper[4749]: W0219 18:43:56.649226 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e026fc_9274_4942_bf3d_68740957aeec.slice/crio-e23529d4057cd5aabebef042c072e628ecc4aefd7c543e73562cbc589abc74b7 WatchSource:0}: Error finding container e23529d4057cd5aabebef042c072e628ecc4aefd7c543e73562cbc589abc74b7: Status 404 returned error can't find the container with id e23529d4057cd5aabebef042c072e628ecc4aefd7c543e73562cbc589abc74b7 Feb 19 18:43:56 crc kubenswrapper[4749]: W0219 18:43:56.653776 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4703030_f4cb_4751_a9e1_5a6c1c9f4332.slice/crio-adbfae840027fbcfe8832285b6b58e7d8ccd400c6c6e76d888d4060682098032 WatchSource:0}: Error finding container adbfae840027fbcfe8832285b6b58e7d8ccd400c6c6e76d888d4060682098032: Status 404 returned error can't find the container with id adbfae840027fbcfe8832285b6b58e7d8ccd400c6c6e76d888d4060682098032 Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.714907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" event={"ID":"fc78af5c-d237-4523-8035-d8992d4b539c","Type":"ContainerStarted","Data":"781c84f2a0b47809db3d04cb1412d778ad2940994171454df1035ee9acd49988"} Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.715668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" event={"ID":"b4703030-f4cb-4751-a9e1-5a6c1c9f4332","Type":"ContainerStarted","Data":"adbfae840027fbcfe8832285b6b58e7d8ccd400c6c6e76d888d4060682098032"} Feb 19 18:43:56 crc kubenswrapper[4749]: I0219 18:43:56.716505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="cert-manager/cert-manager-858654f9db-7ngvd" event={"ID":"30e026fc-9274-4942-bf3d-68740957aeec","Type":"ContainerStarted","Data":"e23529d4057cd5aabebef042c072e628ecc4aefd7c543e73562cbc589abc74b7"} Feb 19 18:44:00 crc kubenswrapper[4749]: I0219 18:44:00.762499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7ngvd" event={"ID":"30e026fc-9274-4942-bf3d-68740957aeec","Type":"ContainerStarted","Data":"6f507c38df3999ee1b77f90022878c6deb05b5a899a0b91e0b4ea848aea2a533"} Feb 19 18:44:00 crc kubenswrapper[4749]: I0219 18:44:00.764550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" event={"ID":"fc78af5c-d237-4523-8035-d8992d4b539c","Type":"ContainerStarted","Data":"141a6aeea57a18e78983a63c868afa94990ce30fc7f35ad26157ef24a4ceb8a0"} Feb 19 18:44:00 crc kubenswrapper[4749]: I0219 18:44:00.764672 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" Feb 19 18:44:00 crc kubenswrapper[4749]: I0219 18:44:00.765869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" event={"ID":"b4703030-f4cb-4751-a9e1-5a6c1c9f4332","Type":"ContainerStarted","Data":"3079874e9df6af04306ea228973038a444a4b3887de3cc283b541e7d0fa90195"} Feb 19 18:44:00 crc kubenswrapper[4749]: I0219 18:44:00.775450 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7ngvd" podStartSLOduration=2.775757747 podStartE2EDuration="5.775435193s" podCreationTimestamp="2026-02-19 18:43:55 +0000 UTC" firstStartedPulling="2026-02-19 18:43:56.651207943 +0000 UTC m=+610.612427917" lastFinishedPulling="2026-02-19 18:43:59.650885409 +0000 UTC m=+613.612105363" observedRunningTime="2026-02-19 18:44:00.775108325 +0000 UTC m=+614.736328279" watchObservedRunningTime="2026-02-19 18:44:00.775435193 +0000 UTC m=+614.736655147" Feb 
19 18:44:00 crc kubenswrapper[4749]: I0219 18:44:00.804506 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pbj6v" podStartSLOduration=2.442700293 podStartE2EDuration="5.80449051s" podCreationTimestamp="2026-02-19 18:43:55 +0000 UTC" firstStartedPulling="2026-02-19 18:43:56.655514817 +0000 UTC m=+610.616734771" lastFinishedPulling="2026-02-19 18:44:00.017305034 +0000 UTC m=+613.978524988" observedRunningTime="2026-02-19 18:44:00.80157708 +0000 UTC m=+614.762797034" watchObservedRunningTime="2026-02-19 18:44:00.80449051 +0000 UTC m=+614.765710464" Feb 19 18:44:00 crc kubenswrapper[4749]: I0219 18:44:00.816302 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" podStartSLOduration=2.667905236 podStartE2EDuration="5.816280044s" podCreationTimestamp="2026-02-19 18:43:55 +0000 UTC" firstStartedPulling="2026-02-19 18:43:56.502102161 +0000 UTC m=+610.463322115" lastFinishedPulling="2026-02-19 18:43:59.650476969 +0000 UTC m=+613.611696923" observedRunningTime="2026-02-19 18:44:00.81359774 +0000 UTC m=+614.774817694" watchObservedRunningTime="2026-02-19 18:44:00.816280044 +0000 UTC m=+614.777499998" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.428413 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hz5j8"] Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.432104 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-controller" containerID="cri-o://2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.432538 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" 
podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="nbdb" containerID="cri-o://ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.432175 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="northd" containerID="cri-o://27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.432545 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-acl-logging" containerID="cri-o://df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.432222 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-node" containerID="cri-o://fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.432251 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="sbdb" containerID="cri-o://0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.432176 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" 
gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.478465 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovnkube-controller" containerID="cri-o://2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" gracePeriod=30 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.778445 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hz5j8_01e6ef30-b3be-4cfe-869c-0341a645215b/ovn-acl-logging/0.log" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.779177 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hz5j8_01e6ef30-b3be-4cfe-869c-0341a645215b/ovn-controller/0.log" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.779823 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.794091 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hz5j8_01e6ef30-b3be-4cfe-869c-0341a645215b/ovn-acl-logging/0.log" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.794722 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hz5j8_01e6ef30-b3be-4cfe-869c-0341a645215b/ovn-controller/0.log" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795205 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" exitCode=0 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795236 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" 
containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" exitCode=0 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795250 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" exitCode=0 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795264 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" exitCode=0 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795276 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" exitCode=0 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795288 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" exitCode=0 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795302 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" exitCode=143 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795315 4749 generic.go:334] "Generic (PLEG): container finished" podID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerID="2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" exitCode=143 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 
18:44:04.795405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795483 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795502 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795519 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795531 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795560 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795572 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795583 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795593 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795605 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795616 4749 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795627 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795638 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795648 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795677 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795691 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795702 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} Feb 19 
18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795713 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795724 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795735 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795746 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795756 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795767 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" event={"ID":"01e6ef30-b3be-4cfe-869c-0341a645215b","Type":"ContainerDied","Data":"d5049b05c8bfc42c2e39890f7612a049bcc2a26b627bedea39c2081326062474"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795795 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795808 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795820 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795831 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795842 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795853 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795865 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795876 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795887 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.795907 4749 scope.go:117] "RemoveContainer" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.796143 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hz5j8" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.801757 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4w8w4_51c18c30-ebd6-48f1-bc44-97849f648ed2/kube-multus/0.log" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.801823 4749 generic.go:334] "Generic (PLEG): container finished" podID="51c18c30-ebd6-48f1-bc44-97849f648ed2" containerID="a041fbac0086e83243af5dde989a265a8d0815962a53402b6f5c778112f054c3" exitCode=2 Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.801853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4w8w4" event={"ID":"51c18c30-ebd6-48f1-bc44-97849f648ed2","Type":"ContainerDied","Data":"a041fbac0086e83243af5dde989a265a8d0815962a53402b6f5c778112f054c3"} Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.802401 4749 scope.go:117] "RemoveContainer" containerID="a041fbac0086e83243af5dde989a265a8d0815962a53402b6f5c778112f054c3" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.815583 4749 scope.go:117] "RemoveContainer" containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.830133 4749 scope.go:117] "RemoveContainer" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.852678 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4jvxz"] Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 
18:44:04.852950 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.852972 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.852985 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-controller" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.852994 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-controller" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.853008 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="sbdb" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853017 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="sbdb" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.853045 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kubecfg-setup" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853052 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kubecfg-setup" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.853061 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-node" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853069 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-node" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 
18:44:04.853086 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-acl-logging" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853093 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-acl-logging" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.853101 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovnkube-controller" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853109 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovnkube-controller" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.853118 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="northd" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853125 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="northd" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.853135 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="nbdb" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853142 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="nbdb" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853251 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="northd" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853261 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-acl-logging" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853275 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853285 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovnkube-controller" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853294 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="sbdb" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853302 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="ovn-controller" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853310 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="kube-rbac-proxy-node" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.853318 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" containerName="nbdb" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.855244 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.859039 4749 scope.go:117] "RemoveContainer" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.873756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-config\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.873807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-etc-openvswitch\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.873834 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01e6ef30-b3be-4cfe-869c-0341a645215b-ovn-node-metrics-cert\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.873857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-netd\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.873958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-node-log\") pod \"ovnkube-node-4jvxz\" (UID: 
\"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.873988 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovnkube-script-lib\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-var-lib-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-run-ovn-kubernetes\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-env-overrides\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-run-netns\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-ovn\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-cni-bin\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-log-socket\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-cni-netd\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-systemd-units\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovnkube-config\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-slash\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovn-node-metrics-cert\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-kubelet\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-etc-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-systemd\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xpqg\" (UniqueName: \"kubernetes.io/projected/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-kube-api-access-5xpqg\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.874767 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.875017 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.875787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.882060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e6ef30-b3be-4cfe-869c-0341a645215b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.895653 4749 scope.go:117] "RemoveContainer" containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.927785 4749 scope.go:117] "RemoveContainer" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.943146 4749 scope.go:117] "RemoveContainer" containerID="df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.956863 4749 scope.go:117] "RemoveContainer" containerID="2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.973821 4749 scope.go:117] "RemoveContainer" containerID="503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.974896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-node-log\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.974937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-script-lib\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.974961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-bin\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") 
" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.974985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-systemd-units\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.974994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-node-log" (OuterVolumeSpecName: "node-log") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975005 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-ovn\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-ovn-kubernetes\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-slash\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975118 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975132 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975173 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975142 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975205 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-slash" (OuterVolumeSpecName: "host-slash") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975207 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-kubelet\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975275 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-env-overrides\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975302 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-var-lib-openvswitch\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975331 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-openvswitch\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975401 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975401 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-log-socket\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-log-socket" (OuterVolumeSpecName: "log-socket") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-systemd\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-netns\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nww4\" (UniqueName: \"kubernetes.io/projected/01e6ef30-b3be-4cfe-869c-0341a645215b-kube-api-access-9nww4\") pod \"01e6ef30-b3be-4cfe-869c-0341a645215b\" (UID: \"01e6ef30-b3be-4cfe-869c-0341a645215b\") " Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975575 4749 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-var-lib-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.975967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-run-ovn-kubernetes\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-env-overrides\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc 
kubenswrapper[4749]: I0219 18:44:04.976013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-var-lib-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976072 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-run-ovn-kubernetes\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-run-netns\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-run-netns\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976135 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-ovn\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976168 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-ovn\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-cni-bin\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-log-socket\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-cni-bin\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-cni-netd\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-systemd-units\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-log-socket\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovnkube-config\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976420 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-cni-netd\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-slash\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-systemd-units\") pod \"ovnkube-node-4jvxz\" (UID: 
\"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovn-node-metrics-cert\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-slash\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-kubelet\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-kubelet\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 
19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-etc-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-systemd\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xpqg\" (UniqueName: \"kubernetes.io/projected/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-kube-api-access-5xpqg\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-node-log\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-etc-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovnkube-script-lib\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-systemd\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.976841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-run-openvswitch\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 
18:44:04.976997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-node-log\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977011 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovnkube-config\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977074 4749 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977103 4749 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-log-socket\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977136 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977155 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977174 4749 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-etc-openvswitch\") on 
node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977193 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977210 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01e6ef30-b3be-4cfe-869c-0341a645215b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977228 4749 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977244 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977259 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977277 4749 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977296 4749 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977314 4749 reconciler_common.go:293] "Volume 
detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977331 4749 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977347 4749 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977365 4749 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977383 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01e6ef30-b3be-4cfe-869c-0341a645215b-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977399 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.977651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovnkube-script-lib\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: 
I0219 18:44:04.979474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-env-overrides\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.979752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e6ef30-b3be-4cfe-869c-0341a645215b-kube-api-access-9nww4" (OuterVolumeSpecName: "kube-api-access-9nww4") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "kube-api-access-9nww4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.980805 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-ovn-node-metrics-cert\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.990745 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "01e6ef30-b3be-4cfe-869c-0341a645215b" (UID: "01e6ef30-b3be-4cfe-869c-0341a645215b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.991693 4749 scope.go:117] "RemoveContainer" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.992173 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": container with ID starting with 2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990 not found: ID does not exist" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.992214 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} err="failed to get container status \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": rpc error: code = NotFound desc = could not find container \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": container with ID starting with 2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.992246 4749 scope.go:117] "RemoveContainer" containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.992998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xpqg\" (UniqueName: \"kubernetes.io/projected/f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2-kube-api-access-5xpqg\") pod \"ovnkube-node-4jvxz\" (UID: \"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.993439 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": container with ID starting with 0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14 not found: ID does not exist" containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.993479 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} err="failed to get container status \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": rpc error: code = NotFound desc = could not find container \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": container with ID starting with 0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.993498 4749 scope.go:117] "RemoveContainer" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.993777 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": container with ID starting with ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9 not found: ID does not exist" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.993826 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} err="failed to get container status \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": rpc error: code = NotFound desc = could not find container 
\"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": container with ID starting with ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.993851 4749 scope.go:117] "RemoveContainer" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.994167 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": container with ID starting with 27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625 not found: ID does not exist" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.994186 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} err="failed to get container status \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": rpc error: code = NotFound desc = could not find container \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": container with ID starting with 27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.994200 4749 scope.go:117] "RemoveContainer" containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.994437 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": container with ID starting with 086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7 not found: ID does not exist" 
containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.994470 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} err="failed to get container status \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": rpc error: code = NotFound desc = could not find container \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": container with ID starting with 086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.994490 4749 scope.go:117] "RemoveContainer" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.994772 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": container with ID starting with fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425 not found: ID does not exist" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.994800 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} err="failed to get container status \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": rpc error: code = NotFound desc = could not find container \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": container with ID starting with fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.994818 4749 scope.go:117] 
"RemoveContainer" containerID="df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.995370 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": container with ID starting with df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84 not found: ID does not exist" containerID="df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.995405 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} err="failed to get container status \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": rpc error: code = NotFound desc = could not find container \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": container with ID starting with df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.995430 4749 scope.go:117] "RemoveContainer" containerID="2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.995774 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": container with ID starting with 2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045 not found: ID does not exist" containerID="2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.995805 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} err="failed to get container status \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": rpc error: code = NotFound desc = could not find container \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": container with ID starting with 2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.995824 4749 scope.go:117] "RemoveContainer" containerID="503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674" Feb 19 18:44:04 crc kubenswrapper[4749]: E0219 18:44:04.996177 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": container with ID starting with 503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674 not found: ID does not exist" containerID="503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.996205 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} err="failed to get container status \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": rpc error: code = NotFound desc = could not find container \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": container with ID starting with 503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.996226 4749 scope.go:117] "RemoveContainer" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.996589 4749 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} err="failed to get container status \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": rpc error: code = NotFound desc = could not find container \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": container with ID starting with 2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.996623 4749 scope.go:117] "RemoveContainer" containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.996927 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} err="failed to get container status \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": rpc error: code = NotFound desc = could not find container \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": container with ID starting with 0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.996952 4749 scope.go:117] "RemoveContainer" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.997243 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} err="failed to get container status \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": rpc error: code = NotFound desc = could not find container \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": container with ID starting with ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9 not 
found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.997277 4749 scope.go:117] "RemoveContainer" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.998260 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} err="failed to get container status \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": rpc error: code = NotFound desc = could not find container \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": container with ID starting with 27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.998285 4749 scope.go:117] "RemoveContainer" containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.999694 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} err="failed to get container status \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": rpc error: code = NotFound desc = could not find container \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": container with ID starting with 086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7 not found: ID does not exist" Feb 19 18:44:04 crc kubenswrapper[4749]: I0219 18:44:04.999729 4749 scope.go:117] "RemoveContainer" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.000094 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} err="failed to get 
container status \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": rpc error: code = NotFound desc = could not find container \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": container with ID starting with fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.000118 4749 scope.go:117] "RemoveContainer" containerID="df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.000491 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} err="failed to get container status \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": rpc error: code = NotFound desc = could not find container \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": container with ID starting with df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.000519 4749 scope.go:117] "RemoveContainer" containerID="2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.000751 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} err="failed to get container status \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": rpc error: code = NotFound desc = could not find container \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": container with ID starting with 2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.000786 4749 scope.go:117] "RemoveContainer" 
containerID="503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.001016 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} err="failed to get container status \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": rpc error: code = NotFound desc = could not find container \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": container with ID starting with 503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.001052 4749 scope.go:117] "RemoveContainer" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.001300 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} err="failed to get container status \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": rpc error: code = NotFound desc = could not find container \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": container with ID starting with 2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.001323 4749 scope.go:117] "RemoveContainer" containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.001720 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} err="failed to get container status \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": rpc error: code = NotFound desc = could 
not find container \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": container with ID starting with 0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.001744 4749 scope.go:117] "RemoveContainer" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.002052 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} err="failed to get container status \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": rpc error: code = NotFound desc = could not find container \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": container with ID starting with ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.002078 4749 scope.go:117] "RemoveContainer" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.002345 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} err="failed to get container status \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": rpc error: code = NotFound desc = could not find container \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": container with ID starting with 27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.002367 4749 scope.go:117] "RemoveContainer" containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 
18:44:05.002596 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} err="failed to get container status \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": rpc error: code = NotFound desc = could not find container \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": container with ID starting with 086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.002618 4749 scope.go:117] "RemoveContainer" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.002824 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} err="failed to get container status \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": rpc error: code = NotFound desc = could not find container \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": container with ID starting with fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.002845 4749 scope.go:117] "RemoveContainer" containerID="df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.003199 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} err="failed to get container status \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": rpc error: code = NotFound desc = could not find container \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": container with ID starting with 
df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.003222 4749 scope.go:117] "RemoveContainer" containerID="2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.003627 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} err="failed to get container status \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": rpc error: code = NotFound desc = could not find container \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": container with ID starting with 2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.003650 4749 scope.go:117] "RemoveContainer" containerID="503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.003885 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} err="failed to get container status \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": rpc error: code = NotFound desc = could not find container \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": container with ID starting with 503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.003908 4749 scope.go:117] "RemoveContainer" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.004237 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} err="failed to get container status \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": rpc error: code = NotFound desc = could not find container \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": container with ID starting with 2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.004266 4749 scope.go:117] "RemoveContainer" containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.004679 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} err="failed to get container status \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": rpc error: code = NotFound desc = could not find container \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": container with ID starting with 0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.004702 4749 scope.go:117] "RemoveContainer" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.005012 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} err="failed to get container status \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": rpc error: code = NotFound desc = could not find container \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": container with ID starting with ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9 not found: ID does not 
exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.005049 4749 scope.go:117] "RemoveContainer" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.005403 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} err="failed to get container status \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": rpc error: code = NotFound desc = could not find container \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": container with ID starting with 27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.005456 4749 scope.go:117] "RemoveContainer" containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.005754 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} err="failed to get container status \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": rpc error: code = NotFound desc = could not find container \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": container with ID starting with 086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.005781 4749 scope.go:117] "RemoveContainer" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.006785 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} err="failed to get container status 
\"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": rpc error: code = NotFound desc = could not find container \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": container with ID starting with fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.006816 4749 scope.go:117] "RemoveContainer" containerID="df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.007147 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84"} err="failed to get container status \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": rpc error: code = NotFound desc = could not find container \"df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84\": container with ID starting with df5e075ba1771647dee4b27062936428ebdbbfddef01cd97d6b68c80d876cf84 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.007171 4749 scope.go:117] "RemoveContainer" containerID="2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.007412 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045"} err="failed to get container status \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": rpc error: code = NotFound desc = could not find container \"2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045\": container with ID starting with 2e8d7947c8d51b740ff78d496e59f052cf420d68cc70cb216b91da9595ad3045 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.007445 4749 scope.go:117] "RemoveContainer" 
containerID="503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.007683 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674"} err="failed to get container status \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": rpc error: code = NotFound desc = could not find container \"503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674\": container with ID starting with 503b3cb49cdeceaf8efa42348f7afbac25cd38e1945edf02bf32b6a807b26674 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.007712 4749 scope.go:117] "RemoveContainer" containerID="2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008015 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990"} err="failed to get container status \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": rpc error: code = NotFound desc = could not find container \"2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990\": container with ID starting with 2259d12312910e88fb2e88d5192ed5ea2c4604cdcd522b28ed354bc7875f6990 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008055 4749 scope.go:117] "RemoveContainer" containerID="0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008277 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14"} err="failed to get container status \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": rpc error: code = NotFound desc = could 
not find container \"0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14\": container with ID starting with 0bebe4e1d72b2f42ea80927903ea85986a29655a06a90dba8b3f67fd7aa3cf14 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008304 4749 scope.go:117] "RemoveContainer" containerID="ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008524 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9"} err="failed to get container status \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": rpc error: code = NotFound desc = could not find container \"ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9\": container with ID starting with ebd1a02d7e134d6f0b41cec64b50d141fd48583363d7a6e723ef600d05982fa9 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008559 4749 scope.go:117] "RemoveContainer" containerID="27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008884 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625"} err="failed to get container status \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": rpc error: code = NotFound desc = could not find container \"27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625\": container with ID starting with 27fad835a6adb8b34fa3b026726fdfbfbe4fa10366b43d42550067cae0723625 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.008910 4749 scope.go:117] "RemoveContainer" containerID="086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 
18:44:05.009254 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7"} err="failed to get container status \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": rpc error: code = NotFound desc = could not find container \"086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7\": container with ID starting with 086f0093495edd85f39ec5c34afcc696cdb15fff042d0d9a3f72fc4397e095e7 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.009284 4749 scope.go:117] "RemoveContainer" containerID="fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.009502 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425"} err="failed to get container status \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": rpc error: code = NotFound desc = could not find container \"fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425\": container with ID starting with fc100d85f98ca7331033866de3310490d1df611e1b8542ae99b9cb46e61f3425 not found: ID does not exist" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.078687 4749 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01e6ef30-b3be-4cfe-869c-0341a645215b-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.078723 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nww4\" (UniqueName: \"kubernetes.io/projected/01e6ef30-b3be-4cfe-869c-0341a645215b-kube-api-access-9nww4\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.135216 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-hz5j8"] Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.138535 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hz5j8"] Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.225425 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:05 crc kubenswrapper[4749]: W0219 18:44:05.243830 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f473f6_e6ee_4c46_b5a6_7242c1a85ff2.slice/crio-16988f0113227b903830190a43bcdb59da9ba3d96df92a898a0c319f92585353 WatchSource:0}: Error finding container 16988f0113227b903830190a43bcdb59da9ba3d96df92a898a0c319f92585353: Status 404 returned error can't find the container with id 16988f0113227b903830190a43bcdb59da9ba3d96df92a898a0c319f92585353 Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.809766 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4w8w4_51c18c30-ebd6-48f1-bc44-97849f648ed2/kube-multus/0.log" Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.809828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4w8w4" event={"ID":"51c18c30-ebd6-48f1-bc44-97849f648ed2","Type":"ContainerStarted","Data":"a37a300cb9e5857d58dad8a70069339a8dfc4d6fa922ddff4de5058ab1185c95"} Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.811817 4749 generic.go:334] "Generic (PLEG): container finished" podID="f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2" containerID="013e38aee70fd91e319bab014097ab3c831882e1c3e6161731065b1c687bba63" exitCode=0 Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.811849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" 
event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerDied","Data":"013e38aee70fd91e319bab014097ab3c831882e1c3e6161731065b1c687bba63"} Feb 19 18:44:05 crc kubenswrapper[4749]: I0219 18:44:05.811867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"16988f0113227b903830190a43bcdb59da9ba3d96df92a898a0c319f92585353"} Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.224425 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.686552 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e6ef30-b3be-4cfe-869c-0341a645215b" path="/var/lib/kubelet/pods/01e6ef30-b3be-4cfe-869c-0341a645215b/volumes" Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.827635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"b46a81c51f7e5da9e5ce24efbf1dd28369a5fdec287a215cee8abc07e921ba9e"} Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.827681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"343067251c610427c873927fa999a96de5025481f82458fcf952668d7f305899"} Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.827692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"8e327791348e43e5050d1d531f55c22bc7746e2b5729f09e56d066c164253651"} Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.827707 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"c7f5d1f74f120280fa6113be1a441025cfea4d72166e2dc5f3ab82c9def24cab"} Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.827718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"a7458696fc6d33dcb3007f6f8508f1ff34252a071cd02ee7e834d2b6d09e46ff"} Feb 19 18:44:06 crc kubenswrapper[4749]: I0219 18:44:06.827727 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"701798e3c1c004496dd827e360a931b2ce6b7176570bb09cfb36b9b2d4a3adce"} Feb 19 18:44:08 crc kubenswrapper[4749]: I0219 18:44:08.845104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"9e321b0ed4eee78e8e8e29672f4ce8f00649240ad388ed4a028959343894850b"} Feb 19 18:44:11 crc kubenswrapper[4749]: I0219 18:44:11.868240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" event={"ID":"f7f473f6-e6ee-4c46-b5a6-7242c1a85ff2","Type":"ContainerStarted","Data":"6a985adc6171d8a85d644de7aa6a890d77ca6692739bacf5fce268e264842a34"} Feb 19 18:44:11 crc kubenswrapper[4749]: I0219 18:44:11.868643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:11 crc kubenswrapper[4749]: I0219 18:44:11.868659 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:11 crc kubenswrapper[4749]: I0219 18:44:11.896702 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:11 crc kubenswrapper[4749]: I0219 18:44:11.900696 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" podStartSLOduration=7.9006418830000005 podStartE2EDuration="7.900641883s" podCreationTimestamp="2026-02-19 18:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:44:11.895431018 +0000 UTC m=+625.856650982" watchObservedRunningTime="2026-02-19 18:44:11.900641883 +0000 UTC m=+625.861861877" Feb 19 18:44:12 crc kubenswrapper[4749]: I0219 18:44:12.873151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:12 crc kubenswrapper[4749]: I0219 18:44:12.896868 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:24 crc kubenswrapper[4749]: I0219 18:44:24.725368 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:44:24 crc kubenswrapper[4749]: I0219 18:44:24.726067 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:44:24 crc kubenswrapper[4749]: I0219 18:44:24.726146 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:44:24 crc kubenswrapper[4749]: 
I0219 18:44:24.727617 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"834fb3e32a872fa725853bfb5119dcd730968979e3bcf1f345f3d75fe740a490"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 18:44:24 crc kubenswrapper[4749]: I0219 18:44:24.727685 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://834fb3e32a872fa725853bfb5119dcd730968979e3bcf1f345f3d75fe740a490" gracePeriod=600 Feb 19 18:44:24 crc kubenswrapper[4749]: I0219 18:44:24.938873 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="834fb3e32a872fa725853bfb5119dcd730968979e3bcf1f345f3d75fe740a490" exitCode=0 Feb 19 18:44:24 crc kubenswrapper[4749]: I0219 18:44:24.939068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"834fb3e32a872fa725853bfb5119dcd730968979e3bcf1f345f3d75fe740a490"} Feb 19 18:44:24 crc kubenswrapper[4749]: I0219 18:44:24.939212 4749 scope.go:117] "RemoveContainer" containerID="e950b1a12f156ed5917a35268977b9c8856e348f780776f4a9cea21c27147df8" Feb 19 18:44:25 crc kubenswrapper[4749]: I0219 18:44:25.945767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"d32840142f59a3f51a6617459783496dfcb99167a3d91ff021347454591db672"} Feb 19 18:44:34 crc kubenswrapper[4749]: I0219 18:44:34.416657 4749 dynamic_cafile_content.go:123] "Loaded 
a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 18:44:35 crc kubenswrapper[4749]: I0219 18:44:35.250262 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4jvxz" Feb 19 18:44:37 crc kubenswrapper[4749]: I0219 18:44:37.863333 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5"] Feb 19 18:44:37 crc kubenswrapper[4749]: I0219 18:44:37.873693 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:37 crc kubenswrapper[4749]: I0219 18:44:37.878143 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 18:44:37 crc kubenswrapper[4749]: I0219 18:44:37.884779 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5"] Feb 19 18:44:37 crc kubenswrapper[4749]: I0219 18:44:37.913589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:37 crc kubenswrapper[4749]: I0219 18:44:37.913748 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvtbj\" (UniqueName: \"kubernetes.io/projected/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-kube-api-access-xvtbj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:37 crc kubenswrapper[4749]: I0219 18:44:37.913880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: I0219 18:44:38.014851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: I0219 18:44:38.014938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvtbj\" (UniqueName: \"kubernetes.io/projected/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-kube-api-access-xvtbj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: I0219 18:44:38.014975 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: 
I0219 18:44:38.015464 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: I0219 18:44:38.015578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: I0219 18:44:38.048552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvtbj\" (UniqueName: \"kubernetes.io/projected/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-kube-api-access-xvtbj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: I0219 18:44:38.221724 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:38 crc kubenswrapper[4749]: I0219 18:44:38.609369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5"] Feb 19 18:44:39 crc kubenswrapper[4749]: I0219 18:44:39.019983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" event={"ID":"0278ac54-8aaa-4f8e-ba20-8986a1467d1f","Type":"ContainerStarted","Data":"7b24a8ef424d3b9d3d7688de76fb9a542fa0701d257d07ad73e4c04a53e3a237"} Feb 19 18:44:39 crc kubenswrapper[4749]: I0219 18:44:39.020285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" event={"ID":"0278ac54-8aaa-4f8e-ba20-8986a1467d1f","Type":"ContainerStarted","Data":"e8c026c1e26a69c11d6fdf754af36e382b148cf41027fdbc1ea5562119a0af0a"} Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.027401 4749 generic.go:334] "Generic (PLEG): container finished" podID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerID="7b24a8ef424d3b9d3d7688de76fb9a542fa0701d257d07ad73e4c04a53e3a237" exitCode=0 Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.027465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" event={"ID":"0278ac54-8aaa-4f8e-ba20-8986a1467d1f","Type":"ContainerDied","Data":"7b24a8ef424d3b9d3d7688de76fb9a542fa0701d257d07ad73e4c04a53e3a237"} Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.189293 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6cx7l"] Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.190560 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.207792 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6cx7l"] Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.247561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lkh6\" (UniqueName: \"kubernetes.io/projected/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-kube-api-access-9lkh6\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.247612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-catalog-content\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.247690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-utilities\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.349740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-catalog-content\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.350331 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-9lkh6\" (UniqueName: \"kubernetes.io/projected/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-kube-api-access-9lkh6\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.350498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-catalog-content\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.350633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-utilities\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.351755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-utilities\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.371824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lkh6\" (UniqueName: \"kubernetes.io/projected/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-kube-api-access-9lkh6\") pod \"redhat-operators-6cx7l\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.512682 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:40 crc kubenswrapper[4749]: I0219 18:44:40.687736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6cx7l"] Feb 19 18:44:41 crc kubenswrapper[4749]: I0219 18:44:41.034035 4749 generic.go:334] "Generic (PLEG): container finished" podID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerID="b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9" exitCode=0 Feb 19 18:44:41 crc kubenswrapper[4749]: I0219 18:44:41.034401 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cx7l" event={"ID":"e64194f0-3c2f-4b40-9f7a-5130684fc9fa","Type":"ContainerDied","Data":"b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9"} Feb 19 18:44:41 crc kubenswrapper[4749]: I0219 18:44:41.034645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cx7l" event={"ID":"e64194f0-3c2f-4b40-9f7a-5130684fc9fa","Type":"ContainerStarted","Data":"fe988f7b0b3005c4d9aadc1f3e424f76dd9f350720a65f1af0c912b237a186c2"} Feb 19 18:44:42 crc kubenswrapper[4749]: I0219 18:44:42.048975 4749 generic.go:334] "Generic (PLEG): container finished" podID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerID="919fc1ecf44948d0644d0518957da21d1744c8b95a6e31fd32fa8ca3d0281c5e" exitCode=0 Feb 19 18:44:42 crc kubenswrapper[4749]: I0219 18:44:42.049160 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" event={"ID":"0278ac54-8aaa-4f8e-ba20-8986a1467d1f","Type":"ContainerDied","Data":"919fc1ecf44948d0644d0518957da21d1744c8b95a6e31fd32fa8ca3d0281c5e"} Feb 19 18:44:42 crc kubenswrapper[4749]: I0219 18:44:42.052747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cx7l" 
event={"ID":"e64194f0-3c2f-4b40-9f7a-5130684fc9fa","Type":"ContainerStarted","Data":"49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a"} Feb 19 18:44:43 crc kubenswrapper[4749]: I0219 18:44:43.060063 4749 generic.go:334] "Generic (PLEG): container finished" podID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerID="49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a" exitCode=0 Feb 19 18:44:43 crc kubenswrapper[4749]: I0219 18:44:43.060128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cx7l" event={"ID":"e64194f0-3c2f-4b40-9f7a-5130684fc9fa","Type":"ContainerDied","Data":"49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a"} Feb 19 18:44:43 crc kubenswrapper[4749]: I0219 18:44:43.062940 4749 generic.go:334] "Generic (PLEG): container finished" podID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerID="1f997f50f43e90d5763c521574a948dd53b83198a88b4bb27e040ce87f1eecdd" exitCode=0 Feb 19 18:44:43 crc kubenswrapper[4749]: I0219 18:44:43.062972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" event={"ID":"0278ac54-8aaa-4f8e-ba20-8986a1467d1f","Type":"ContainerDied","Data":"1f997f50f43e90d5763c521574a948dd53b83198a88b4bb27e040ce87f1eecdd"} Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.071530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cx7l" event={"ID":"e64194f0-3c2f-4b40-9f7a-5130684fc9fa","Type":"ContainerStarted","Data":"511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de"} Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.096658 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6cx7l" podStartSLOduration=1.470052069 podStartE2EDuration="4.096637459s" podCreationTimestamp="2026-02-19 18:44:40 +0000 UTC" 
firstStartedPulling="2026-02-19 18:44:41.047539367 +0000 UTC m=+655.008759321" lastFinishedPulling="2026-02-19 18:44:43.674124757 +0000 UTC m=+657.635344711" observedRunningTime="2026-02-19 18:44:44.094321824 +0000 UTC m=+658.055541798" watchObservedRunningTime="2026-02-19 18:44:44.096637459 +0000 UTC m=+658.057857423" Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.304794 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.399925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-util\") pod \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.399988 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-bundle\") pod \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.400105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvtbj\" (UniqueName: \"kubernetes.io/projected/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-kube-api-access-xvtbj\") pod \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\" (UID: \"0278ac54-8aaa-4f8e-ba20-8986a1467d1f\") " Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.403938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-bundle" (OuterVolumeSpecName: "bundle") pod "0278ac54-8aaa-4f8e-ba20-8986a1467d1f" (UID: "0278ac54-8aaa-4f8e-ba20-8986a1467d1f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.405348 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-kube-api-access-xvtbj" (OuterVolumeSpecName: "kube-api-access-xvtbj") pod "0278ac54-8aaa-4f8e-ba20-8986a1467d1f" (UID: "0278ac54-8aaa-4f8e-ba20-8986a1467d1f"). InnerVolumeSpecName "kube-api-access-xvtbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.420831 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-util" (OuterVolumeSpecName: "util") pod "0278ac54-8aaa-4f8e-ba20-8986a1467d1f" (UID: "0278ac54-8aaa-4f8e-ba20-8986a1467d1f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.502115 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-util\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.502158 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:44 crc kubenswrapper[4749]: I0219 18:44:44.502168 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvtbj\" (UniqueName: \"kubernetes.io/projected/0278ac54-8aaa-4f8e-ba20-8986a1467d1f-kube-api-access-xvtbj\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:45 crc kubenswrapper[4749]: I0219 18:44:45.080799 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" Feb 19 18:44:45 crc kubenswrapper[4749]: I0219 18:44:45.080798 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5" event={"ID":"0278ac54-8aaa-4f8e-ba20-8986a1467d1f","Type":"ContainerDied","Data":"e8c026c1e26a69c11d6fdf754af36e382b148cf41027fdbc1ea5562119a0af0a"} Feb 19 18:44:45 crc kubenswrapper[4749]: I0219 18:44:45.081422 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8c026c1e26a69c11d6fdf754af36e382b148cf41027fdbc1ea5562119a0af0a" Feb 19 18:44:50 crc kubenswrapper[4749]: I0219 18:44:50.513775 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:50 crc kubenswrapper[4749]: I0219 18:44:50.514233 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:50 crc kubenswrapper[4749]: I0219 18:44:50.556240 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:51 crc kubenswrapper[4749]: I0219 18:44:51.556468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:52 crc kubenswrapper[4749]: I0219 18:44:52.951521 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6cx7l"] Feb 19 18:44:53 crc kubenswrapper[4749]: I0219 18:44:53.504366 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6cx7l" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="registry-server" containerID="cri-o://511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de" gracePeriod=2 Feb 19 18:44:53 
crc kubenswrapper[4749]: I0219 18:44:53.847834 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:53 crc kubenswrapper[4749]: I0219 18:44:53.925758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lkh6\" (UniqueName: \"kubernetes.io/projected/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-kube-api-access-9lkh6\") pod \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " Feb 19 18:44:53 crc kubenswrapper[4749]: I0219 18:44:53.925838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-catalog-content\") pod \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " Feb 19 18:44:53 crc kubenswrapper[4749]: I0219 18:44:53.925855 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-utilities\") pod \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\" (UID: \"e64194f0-3c2f-4b40-9f7a-5130684fc9fa\") " Feb 19 18:44:53 crc kubenswrapper[4749]: I0219 18:44:53.926927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-utilities" (OuterVolumeSpecName: "utilities") pod "e64194f0-3c2f-4b40-9f7a-5130684fc9fa" (UID: "e64194f0-3c2f-4b40-9f7a-5130684fc9fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:44:53 crc kubenswrapper[4749]: I0219 18:44:53.933357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-kube-api-access-9lkh6" (OuterVolumeSpecName: "kube-api-access-9lkh6") pod "e64194f0-3c2f-4b40-9f7a-5130684fc9fa" (UID: "e64194f0-3c2f-4b40-9f7a-5130684fc9fa"). InnerVolumeSpecName "kube-api-access-9lkh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.027543 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.027575 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lkh6\" (UniqueName: \"kubernetes.io/projected/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-kube-api-access-9lkh6\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.057307 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e64194f0-3c2f-4b40-9f7a-5130684fc9fa" (UID: "e64194f0-3c2f-4b40-9f7a-5130684fc9fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.127974 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64194f0-3c2f-4b40-9f7a-5130684fc9fa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.395719 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88"] Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.395959 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerName="util" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.395973 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerName="util" Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.395989 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="registry-server" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.395997 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="registry-server" Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.396028 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerName="pull" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.396037 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerName="pull" Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.396071 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="extract-content" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.396080 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="extract-content" Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.396095 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerName="extract" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.396105 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerName="extract" Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.396127 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="extract-utilities" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.396137 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="extract-utilities" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.396312 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerName="registry-server" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.396342 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0278ac54-8aaa-4f8e-ba20-8986a1467d1f" containerName="extract" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.396889 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.398955 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.399252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zk4tt" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.399491 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.406212 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.511569 4749 generic.go:334] "Generic (PLEG): container finished" podID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" containerID="511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de" exitCode=0 Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.511610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cx7l" event={"ID":"e64194f0-3c2f-4b40-9f7a-5130684fc9fa","Type":"ContainerDied","Data":"511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de"} Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.511650 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cx7l" event={"ID":"e64194f0-3c2f-4b40-9f7a-5130684fc9fa","Type":"ContainerDied","Data":"fe988f7b0b3005c4d9aadc1f3e424f76dd9f350720a65f1af0c912b237a186c2"} Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.511669 4749 scope.go:117] "RemoveContainer" containerID="511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.511694 4749 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6cx7l" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.533861 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.534165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxclz\" (UniqueName: \"kubernetes.io/projected/f02e6039-b225-4177-9704-4cdd8b15f297-kube-api-access-dxclz\") pod \"obo-prometheus-operator-68bc856cb9-hcn88\" (UID: \"f02e6039-b225-4177-9704-4cdd8b15f297\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.534524 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.536854 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-v6ftp" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.537180 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.538020 4749 scope.go:117] "RemoveContainer" containerID="49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.543083 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.550367 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.555926 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.563005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.573479 4749 scope.go:117] "RemoveContainer" containerID="b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.597522 4749 scope.go:117] "RemoveContainer" containerID="511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.598312 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6cx7l"] Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.601340 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de\": container with ID starting with 511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de not found: ID does not exist" containerID="511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.601381 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de"} err="failed to get container status \"511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de\": rpc error: code = NotFound desc = could not find container \"511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de\": container with ID 
starting with 511cb7fa1ed21aaceec28e86e4bcbfbabc1697c9b83df34668e686196537a8de not found: ID does not exist" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.601408 4749 scope.go:117] "RemoveContainer" containerID="49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a" Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.601769 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a\": container with ID starting with 49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a not found: ID does not exist" containerID="49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.601805 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a"} err="failed to get container status \"49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a\": rpc error: code = NotFound desc = could not find container \"49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a\": container with ID starting with 49a1f8a3f1c5593fda915ae93d4415ee988d9479ac79b2f7667719ca9d6a7a9a not found: ID does not exist" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.601828 4749 scope.go:117] "RemoveContainer" containerID="b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9" Feb 19 18:44:54 crc kubenswrapper[4749]: E0219 18:44:54.605155 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9\": container with ID starting with b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9 not found: ID does not exist" containerID="b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9" Feb 19 
18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.605187 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9"} err="failed to get container status \"b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9\": rpc error: code = NotFound desc = could not find container \"b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9\": container with ID starting with b53f515be174410172ee1325d0fc4a81fea1661ae5bd2f9a29d5f99df96021a9 not found: ID does not exist" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.608785 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6cx7l"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.637388 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxclz\" (UniqueName: \"kubernetes.io/projected/f02e6039-b225-4177-9704-4cdd8b15f297-kube-api-access-dxclz\") pod \"obo-prometheus-operator-68bc856cb9-hcn88\" (UID: \"f02e6039-b225-4177-9704-4cdd8b15f297\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.637470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6107d2c9-e758-426c-8c42-d8a9241b1ce8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb\" (UID: \"6107d2c9-e758-426c-8c42-d8a9241b1ce8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.637510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6107d2c9-e758-426c-8c42-d8a9241b1ce8-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb\" (UID: \"6107d2c9-e758-426c-8c42-d8a9241b1ce8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.653417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxclz\" (UniqueName: \"kubernetes.io/projected/f02e6039-b225-4177-9704-4cdd8b15f297-kube-api-access-dxclz\") pod \"obo-prometheus-operator-68bc856cb9-hcn88\" (UID: \"f02e6039-b225-4177-9704-4cdd8b15f297\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.685313 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64194f0-3c2f-4b40-9f7a-5130684fc9fa" path="/var/lib/kubelet/pods/e64194f0-3c2f-4b40-9f7a-5130684fc9fa/volumes" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.715989 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.720837 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-74zcg"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.721625 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.725309 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-l9lbn" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.728372 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.734822 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-74zcg"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.739574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b734a5b-56c7-4001-8c16-e4a75f50afb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8\" (UID: \"8b734a5b-56c7-4001-8c16-e4a75f50afb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.739623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b734a5b-56c7-4001-8c16-e4a75f50afb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8\" (UID: \"8b734a5b-56c7-4001-8c16-e4a75f50afb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.739679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6107d2c9-e758-426c-8c42-d8a9241b1ce8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb\" (UID: \"6107d2c9-e758-426c-8c42-d8a9241b1ce8\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.739698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6107d2c9-e758-426c-8c42-d8a9241b1ce8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb\" (UID: \"6107d2c9-e758-426c-8c42-d8a9241b1ce8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.742866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6107d2c9-e758-426c-8c42-d8a9241b1ce8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb\" (UID: \"6107d2c9-e758-426c-8c42-d8a9241b1ce8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.744857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6107d2c9-e758-426c-8c42-d8a9241b1ce8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb\" (UID: \"6107d2c9-e758-426c-8c42-d8a9241b1ce8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.840566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skb47\" (UniqueName: \"kubernetes.io/projected/74ad505e-d15c-43b4-b072-444ffdedf939-kube-api-access-skb47\") pod \"observability-operator-59bdc8b94-74zcg\" (UID: \"74ad505e-d15c-43b4-b072-444ffdedf939\") " pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.840633 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/74ad505e-d15c-43b4-b072-444ffdedf939-observability-operator-tls\") pod \"observability-operator-59bdc8b94-74zcg\" (UID: \"74ad505e-d15c-43b4-b072-444ffdedf939\") " pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.840669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b734a5b-56c7-4001-8c16-e4a75f50afb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8\" (UID: \"8b734a5b-56c7-4001-8c16-e4a75f50afb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.841094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b734a5b-56c7-4001-8c16-e4a75f50afb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8\" (UID: \"8b734a5b-56c7-4001-8c16-e4a75f50afb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.846606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b734a5b-56c7-4001-8c16-e4a75f50afb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8\" (UID: \"8b734a5b-56c7-4001-8c16-e4a75f50afb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.863494 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b734a5b-56c7-4001-8c16-e4a75f50afb3-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8\" (UID: \"8b734a5b-56c7-4001-8c16-e4a75f50afb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.887927 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.910143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.921035 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qs22p"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.921780 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.929506 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-rskrs" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.933211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qs22p"] Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.944245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/456b4e23-1427-4b46-9672-87cff5dd12b9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qs22p\" (UID: \"456b4e23-1427-4b46-9672-87cff5dd12b9\") " pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.944301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skb47\" 
(UniqueName: \"kubernetes.io/projected/74ad505e-d15c-43b4-b072-444ffdedf939-kube-api-access-skb47\") pod \"observability-operator-59bdc8b94-74zcg\" (UID: \"74ad505e-d15c-43b4-b072-444ffdedf939\") " pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.944344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/74ad505e-d15c-43b4-b072-444ffdedf939-observability-operator-tls\") pod \"observability-operator-59bdc8b94-74zcg\" (UID: \"74ad505e-d15c-43b4-b072-444ffdedf939\") " pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.944365 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6blf\" (UniqueName: \"kubernetes.io/projected/456b4e23-1427-4b46-9672-87cff5dd12b9-kube-api-access-w6blf\") pod \"perses-operator-5bf474d74f-qs22p\" (UID: \"456b4e23-1427-4b46-9672-87cff5dd12b9\") " pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.954206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/74ad505e-d15c-43b4-b072-444ffdedf939-observability-operator-tls\") pod \"observability-operator-59bdc8b94-74zcg\" (UID: \"74ad505e-d15c-43b4-b072-444ffdedf939\") " pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:44:54 crc kubenswrapper[4749]: I0219 18:44:54.965091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skb47\" (UniqueName: \"kubernetes.io/projected/74ad505e-d15c-43b4-b072-444ffdedf939-kube-api-access-skb47\") pod \"observability-operator-59bdc8b94-74zcg\" (UID: \"74ad505e-d15c-43b4-b072-444ffdedf939\") " pod="openshift-operators/observability-operator-59bdc8b94-74zcg" 
Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.046967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6blf\" (UniqueName: \"kubernetes.io/projected/456b4e23-1427-4b46-9672-87cff5dd12b9-kube-api-access-w6blf\") pod \"perses-operator-5bf474d74f-qs22p\" (UID: \"456b4e23-1427-4b46-9672-87cff5dd12b9\") " pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.047063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/456b4e23-1427-4b46-9672-87cff5dd12b9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qs22p\" (UID: \"456b4e23-1427-4b46-9672-87cff5dd12b9\") " pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.053306 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/456b4e23-1427-4b46-9672-87cff5dd12b9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qs22p\" (UID: \"456b4e23-1427-4b46-9672-87cff5dd12b9\") " pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.071454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6blf\" (UniqueName: \"kubernetes.io/projected/456b4e23-1427-4b46-9672-87cff5dd12b9-kube-api-access-w6blf\") pod \"perses-operator-5bf474d74f-qs22p\" (UID: \"456b4e23-1427-4b46-9672-87cff5dd12b9\") " pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.072208 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88"] Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.100621 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.147113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8"] Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.237598 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb"] Feb 19 18:44:55 crc kubenswrapper[4749]: W0219 18:44:55.248590 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6107d2c9_e758_426c_8c42_d8a9241b1ce8.slice/crio-a2cd7e57239556614e0079a3cee5a42b8d0693e771664b009fea5e3ddbc35488 WatchSource:0}: Error finding container a2cd7e57239556614e0079a3cee5a42b8d0693e771664b009fea5e3ddbc35488: Status 404 returned error can't find the container with id a2cd7e57239556614e0079a3cee5a42b8d0693e771664b009fea5e3ddbc35488 Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.265492 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.322990 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-74zcg"] Feb 19 18:44:55 crc kubenswrapper[4749]: W0219 18:44:55.474805 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod456b4e23_1427_4b46_9672_87cff5dd12b9.slice/crio-8eec5fad353df8d14b63eac92de7084591214ec6133fe1ecff3dc89c4c8c596f WatchSource:0}: Error finding container 8eec5fad353df8d14b63eac92de7084591214ec6133fe1ecff3dc89c4c8c596f: Status 404 returned error can't find the container with id 8eec5fad353df8d14b63eac92de7084591214ec6133fe1ecff3dc89c4c8c596f Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.479407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qs22p"] Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.518474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-74zcg" event={"ID":"74ad505e-d15c-43b4-b072-444ffdedf939","Type":"ContainerStarted","Data":"b90d80e4cac0b98a28b1a30e73fca55c7033737da535edd721e87b4288912eaf"} Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.519839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" event={"ID":"6107d2c9-e758-426c-8c42-d8a9241b1ce8","Type":"ContainerStarted","Data":"a2cd7e57239556614e0079a3cee5a42b8d0693e771664b009fea5e3ddbc35488"} Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.520788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-qs22p" 
event={"ID":"456b4e23-1427-4b46-9672-87cff5dd12b9","Type":"ContainerStarted","Data":"8eec5fad353df8d14b63eac92de7084591214ec6133fe1ecff3dc89c4c8c596f"} Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.521670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" event={"ID":"f02e6039-b225-4177-9704-4cdd8b15f297","Type":"ContainerStarted","Data":"1b3295956b55d3ef3551edffc4f5faac6a9936c055c97655417b802eeb683fee"} Feb 19 18:44:55 crc kubenswrapper[4749]: I0219 18:44:55.523238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" event={"ID":"8b734a5b-56c7-4001-8c16-e4a75f50afb3","Type":"ContainerStarted","Data":"d04afcc4bcc424c88282223a374a5820012ecdf5bddb3d7c0e0f29be032cfeb4"} Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.151585 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb"] Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.153086 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.162717 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.163870 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.193308 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb"] Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.325152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f395667-c914-4b04-a6b4-52180a9b0356-config-volume\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.325229 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trlx\" (UniqueName: \"kubernetes.io/projected/6f395667-c914-4b04-a6b4-52180a9b0356-kube-api-access-7trlx\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.325319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f395667-c914-4b04-a6b4-52180a9b0356-secret-volume\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.426748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f395667-c914-4b04-a6b4-52180a9b0356-config-volume\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.426789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trlx\" (UniqueName: \"kubernetes.io/projected/6f395667-c914-4b04-a6b4-52180a9b0356-kube-api-access-7trlx\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.426868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f395667-c914-4b04-a6b4-52180a9b0356-secret-volume\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.427664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f395667-c914-4b04-a6b4-52180a9b0356-config-volume\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.437732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6f395667-c914-4b04-a6b4-52180a9b0356-secret-volume\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.465358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7trlx\" (UniqueName: \"kubernetes.io/projected/6f395667-c914-4b04-a6b4-52180a9b0356-kube-api-access-7trlx\") pod \"collect-profiles-29525445-n2zcb\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:00 crc kubenswrapper[4749]: I0219 18:45:00.474230 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.415284 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb"] Feb 19 18:45:06 crc kubenswrapper[4749]: W0219 18:45:06.436650 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f395667_c914_4b04_a6b4_52180a9b0356.slice/crio-d0dbea33e878d1834405e7bb2f3d15f741252356091d81cd5c83d0c79d2fddc1 WatchSource:0}: Error finding container d0dbea33e878d1834405e7bb2f3d15f741252356091d81cd5c83d0c79d2fddc1: Status 404 returned error can't find the container with id d0dbea33e878d1834405e7bb2f3d15f741252356091d81cd5c83d0c79d2fddc1 Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.612298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-74zcg" event={"ID":"74ad505e-d15c-43b4-b072-444ffdedf939","Type":"ContainerStarted","Data":"e049627bf5b3eac6b0f64fdcaa39c0c193d1bd193eb947b42affcd1a6525aa5e"} Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 
18:45:06.612386 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.614766 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-74zcg" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.615472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" event={"ID":"6107d2c9-e758-426c-8c42-d8a9241b1ce8","Type":"ContainerStarted","Data":"1051758b656f167cbecfdd5b5f51a793de0f541dd0dd36a0eb8e0c49be504e1b"} Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.618978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-qs22p" event={"ID":"456b4e23-1427-4b46-9672-87cff5dd12b9","Type":"ContainerStarted","Data":"2adb65082dec96044ea9cd23dbb3ff496e534d5b632a9fb63a91f39ade234d85"} Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.619454 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.620986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" event={"ID":"f02e6039-b225-4177-9704-4cdd8b15f297","Type":"ContainerStarted","Data":"5ad83343fc485b1d720a3ef1d659201ba37afe4f5c8e9edc5b890bf27429207a"} Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.622628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" event={"ID":"6f395667-c914-4b04-a6b4-52180a9b0356","Type":"ContainerStarted","Data":"891f89fc4ca6204e3e993bf46d34bb48a9e9cfc30359e7e4e2dd7c016f26e419"} Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.622661 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" event={"ID":"6f395667-c914-4b04-a6b4-52180a9b0356","Type":"ContainerStarted","Data":"d0dbea33e878d1834405e7bb2f3d15f741252356091d81cd5c83d0c79d2fddc1"} Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.624795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" event={"ID":"8b734a5b-56c7-4001-8c16-e4a75f50afb3","Type":"ContainerStarted","Data":"c641a66e7d0ab16b5252cde4ea4e38bd008cea50a650abc59498612f65970c5f"} Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.640283 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-74zcg" podStartSLOduration=1.828507959 podStartE2EDuration="12.640261178s" podCreationTimestamp="2026-02-19 18:44:54 +0000 UTC" firstStartedPulling="2026-02-19 18:44:55.333849132 +0000 UTC m=+669.295069086" lastFinishedPulling="2026-02-19 18:45:06.145602351 +0000 UTC m=+680.106822305" observedRunningTime="2026-02-19 18:45:06.637675416 +0000 UTC m=+680.598895400" watchObservedRunningTime="2026-02-19 18:45:06.640261178 +0000 UTC m=+680.601481132" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.683566 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb" podStartSLOduration=1.750964126 podStartE2EDuration="12.683547399s" podCreationTimestamp="2026-02-19 18:44:54 +0000 UTC" firstStartedPulling="2026-02-19 18:44:55.250723365 +0000 UTC m=+669.211943309" lastFinishedPulling="2026-02-19 18:45:06.183306628 +0000 UTC m=+680.144526582" observedRunningTime="2026-02-19 18:45:06.678862407 +0000 UTC m=+680.640082371" watchObservedRunningTime="2026-02-19 18:45:06.683547399 +0000 UTC m=+680.644767353" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.703691 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-qs22p" podStartSLOduration=2.011399636 podStartE2EDuration="12.703655463s" podCreationTimestamp="2026-02-19 18:44:54 +0000 UTC" firstStartedPulling="2026-02-19 18:44:55.480567008 +0000 UTC m=+669.441786962" lastFinishedPulling="2026-02-19 18:45:06.172822835 +0000 UTC m=+680.134042789" observedRunningTime="2026-02-19 18:45:06.699685757 +0000 UTC m=+680.660905721" watchObservedRunningTime="2026-02-19 18:45:06.703655463 +0000 UTC m=+680.664875417" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.723392 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8" podStartSLOduration=1.7727504 podStartE2EDuration="12.723364386s" podCreationTimestamp="2026-02-19 18:44:54 +0000 UTC" firstStartedPulling="2026-02-19 18:44:55.179243837 +0000 UTC m=+669.140463791" lastFinishedPulling="2026-02-19 18:45:06.129857823 +0000 UTC m=+680.091077777" observedRunningTime="2026-02-19 18:45:06.718887598 +0000 UTC m=+680.680107552" watchObservedRunningTime="2026-02-19 18:45:06.723364386 +0000 UTC m=+680.684584340" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.743365 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hcn88" podStartSLOduration=1.645130722 podStartE2EDuration="12.743341676s" podCreationTimestamp="2026-02-19 18:44:54 +0000 UTC" firstStartedPulling="2026-02-19 18:44:55.084887649 +0000 UTC m=+669.046107603" lastFinishedPulling="2026-02-19 18:45:06.183098603 +0000 UTC m=+680.144318557" observedRunningTime="2026-02-19 18:45:06.736597484 +0000 UTC m=+680.697817458" watchObservedRunningTime="2026-02-19 18:45:06.743341676 +0000 UTC m=+680.704561630" Feb 19 18:45:06 crc kubenswrapper[4749]: I0219 18:45:06.765428 4749 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" podStartSLOduration=6.765408016 podStartE2EDuration="6.765408016s" podCreationTimestamp="2026-02-19 18:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:45:06.763695725 +0000 UTC m=+680.724915689" watchObservedRunningTime="2026-02-19 18:45:06.765408016 +0000 UTC m=+680.726627970" Feb 19 18:45:07 crc kubenswrapper[4749]: I0219 18:45:07.630705 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f395667-c914-4b04-a6b4-52180a9b0356" containerID="891f89fc4ca6204e3e993bf46d34bb48a9e9cfc30359e7e4e2dd7c016f26e419" exitCode=0 Feb 19 18:45:07 crc kubenswrapper[4749]: I0219 18:45:07.630801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" event={"ID":"6f395667-c914-4b04-a6b4-52180a9b0356","Type":"ContainerDied","Data":"891f89fc4ca6204e3e993bf46d34bb48a9e9cfc30359e7e4e2dd7c016f26e419"} Feb 19 18:45:08 crc kubenswrapper[4749]: I0219 18:45:08.919448 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:08 crc kubenswrapper[4749]: I0219 18:45:08.950280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f395667-c914-4b04-a6b4-52180a9b0356-secret-volume\") pod \"6f395667-c914-4b04-a6b4-52180a9b0356\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " Feb 19 18:45:08 crc kubenswrapper[4749]: I0219 18:45:08.950320 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f395667-c914-4b04-a6b4-52180a9b0356-config-volume\") pod \"6f395667-c914-4b04-a6b4-52180a9b0356\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " Feb 19 18:45:08 crc kubenswrapper[4749]: I0219 18:45:08.950387 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7trlx\" (UniqueName: \"kubernetes.io/projected/6f395667-c914-4b04-a6b4-52180a9b0356-kube-api-access-7trlx\") pod \"6f395667-c914-4b04-a6b4-52180a9b0356\" (UID: \"6f395667-c914-4b04-a6b4-52180a9b0356\") " Feb 19 18:45:08 crc kubenswrapper[4749]: I0219 18:45:08.950938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f395667-c914-4b04-a6b4-52180a9b0356-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f395667-c914-4b04-a6b4-52180a9b0356" (UID: "6f395667-c914-4b04-a6b4-52180a9b0356"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:45:08 crc kubenswrapper[4749]: I0219 18:45:08.958222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f395667-c914-4b04-a6b4-52180a9b0356-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f395667-c914-4b04-a6b4-52180a9b0356" (UID: "6f395667-c914-4b04-a6b4-52180a9b0356"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:45:08 crc kubenswrapper[4749]: I0219 18:45:08.967478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f395667-c914-4b04-a6b4-52180a9b0356-kube-api-access-7trlx" (OuterVolumeSpecName: "kube-api-access-7trlx") pod "6f395667-c914-4b04-a6b4-52180a9b0356" (UID: "6f395667-c914-4b04-a6b4-52180a9b0356"). InnerVolumeSpecName "kube-api-access-7trlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:45:09 crc kubenswrapper[4749]: I0219 18:45:09.056668 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f395667-c914-4b04-a6b4-52180a9b0356-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 18:45:09 crc kubenswrapper[4749]: I0219 18:45:09.059235 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f395667-c914-4b04-a6b4-52180a9b0356-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 18:45:09 crc kubenswrapper[4749]: I0219 18:45:09.062270 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7trlx\" (UniqueName: \"kubernetes.io/projected/6f395667-c914-4b04-a6b4-52180a9b0356-kube-api-access-7trlx\") on node \"crc\" DevicePath \"\"" Feb 19 18:45:09 crc kubenswrapper[4749]: I0219 18:45:09.648635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" event={"ID":"6f395667-c914-4b04-a6b4-52180a9b0356","Type":"ContainerDied","Data":"d0dbea33e878d1834405e7bb2f3d15f741252356091d81cd5c83d0c79d2fddc1"} Feb 19 18:45:09 crc kubenswrapper[4749]: I0219 18:45:09.648709 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0dbea33e878d1834405e7bb2f3d15f741252356091d81cd5c83d0c79d2fddc1" Feb 19 18:45:09 crc kubenswrapper[4749]: I0219 18:45:09.648775 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb" Feb 19 18:45:15 crc kubenswrapper[4749]: I0219 18:45:15.270531 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-qs22p" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.512748 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr"] Feb 19 18:45:32 crc kubenswrapper[4749]: E0219 18:45:32.514628 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f395667-c914-4b04-a6b4-52180a9b0356" containerName="collect-profiles" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.514726 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f395667-c914-4b04-a6b4-52180a9b0356" containerName="collect-profiles" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.514918 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f395667-c914-4b04-a6b4-52180a9b0356" containerName="collect-profiles" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.515927 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.518309 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.528761 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr"] Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.590852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.590900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmp56\" (UniqueName: \"kubernetes.io/projected/9bc71b96-3cbf-4481-a9ac-d77071db9e39-kube-api-access-wmp56\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.590927 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: 
I0219 18:45:32.692173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.692211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmp56\" (UniqueName: \"kubernetes.io/projected/9bc71b96-3cbf-4481-a9ac-d77071db9e39-kube-api-access-wmp56\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.692235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.692610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.693057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.713495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmp56\" (UniqueName: \"kubernetes.io/projected/9bc71b96-3cbf-4481-a9ac-d77071db9e39-kube-api-access-wmp56\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:32 crc kubenswrapper[4749]: I0219 18:45:32.831880 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:33 crc kubenswrapper[4749]: I0219 18:45:33.266681 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr"] Feb 19 18:45:33 crc kubenswrapper[4749]: I0219 18:45:33.820943 4749 generic.go:334] "Generic (PLEG): container finished" podID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerID="574be4ab98ec02b4a4a25bac0edff0109f57f4ab1ee14484b092a08ffede6cc8" exitCode=0 Feb 19 18:45:33 crc kubenswrapper[4749]: I0219 18:45:33.821003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" event={"ID":"9bc71b96-3cbf-4481-a9ac-d77071db9e39","Type":"ContainerDied","Data":"574be4ab98ec02b4a4a25bac0edff0109f57f4ab1ee14484b092a08ffede6cc8"} Feb 19 18:45:33 crc kubenswrapper[4749]: I0219 18:45:33.821293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" event={"ID":"9bc71b96-3cbf-4481-a9ac-d77071db9e39","Type":"ContainerStarted","Data":"2170c7490f5de58d3f98342b70765e68dc212ab1339709f540bc96fbf499a6d4"} Feb 19 18:45:37 crc kubenswrapper[4749]: I0219 18:45:37.845178 4749 generic.go:334] "Generic (PLEG): container finished" podID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerID="63aa581a03017b328387187ac0f501bff034f42e92fc9e719d31e64d120e3918" exitCode=0 Feb 19 18:45:37 crc kubenswrapper[4749]: I0219 18:45:37.845216 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" event={"ID":"9bc71b96-3cbf-4481-a9ac-d77071db9e39","Type":"ContainerDied","Data":"63aa581a03017b328387187ac0f501bff034f42e92fc9e719d31e64d120e3918"} Feb 19 18:45:38 crc kubenswrapper[4749]: I0219 18:45:38.852302 4749 generic.go:334] "Generic (PLEG): container finished" podID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerID="cc903425dfa448d5a40321ef7795a026bf7822bf566e109528e53fcfb90352f6" exitCode=0 Feb 19 18:45:38 crc kubenswrapper[4749]: I0219 18:45:38.852388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" event={"ID":"9bc71b96-3cbf-4481-a9ac-d77071db9e39","Type":"ContainerDied","Data":"cc903425dfa448d5a40321ef7795a026bf7822bf566e109528e53fcfb90352f6"} Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.060869 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.185218 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmp56\" (UniqueName: \"kubernetes.io/projected/9bc71b96-3cbf-4481-a9ac-d77071db9e39-kube-api-access-wmp56\") pod \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.185278 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-util\") pod \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.185323 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-bundle\") pod \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\" (UID: \"9bc71b96-3cbf-4481-a9ac-d77071db9e39\") " Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.186104 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-bundle" (OuterVolumeSpecName: "bundle") pod "9bc71b96-3cbf-4481-a9ac-d77071db9e39" (UID: "9bc71b96-3cbf-4481-a9ac-d77071db9e39"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.192238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc71b96-3cbf-4481-a9ac-d77071db9e39-kube-api-access-wmp56" (OuterVolumeSpecName: "kube-api-access-wmp56") pod "9bc71b96-3cbf-4481-a9ac-d77071db9e39" (UID: "9bc71b96-3cbf-4481-a9ac-d77071db9e39"). InnerVolumeSpecName "kube-api-access-wmp56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.195956 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-util" (OuterVolumeSpecName: "util") pod "9bc71b96-3cbf-4481-a9ac-d77071db9e39" (UID: "9bc71b96-3cbf-4481-a9ac-d77071db9e39"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.286941 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.286980 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmp56\" (UniqueName: \"kubernetes.io/projected/9bc71b96-3cbf-4481-a9ac-d77071db9e39-kube-api-access-wmp56\") on node \"crc\" DevicePath \"\"" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.286990 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9bc71b96-3cbf-4481-a9ac-d77071db9e39-util\") on node \"crc\" DevicePath \"\"" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.866432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" event={"ID":"9bc71b96-3cbf-4481-a9ac-d77071db9e39","Type":"ContainerDied","Data":"2170c7490f5de58d3f98342b70765e68dc212ab1339709f540bc96fbf499a6d4"} Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.866474 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2170c7490f5de58d3f98342b70765e68dc212ab1339709f540bc96fbf499a6d4" Feb 19 18:45:40 crc kubenswrapper[4749]: I0219 18:45:40.866506 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.082880 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4thb7"] Feb 19 18:45:44 crc kubenswrapper[4749]: E0219 18:45:44.083639 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerName="extract" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.083651 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerName="extract" Feb 19 18:45:44 crc kubenswrapper[4749]: E0219 18:45:44.083664 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerName="pull" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.083670 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerName="pull" Feb 19 18:45:44 crc kubenswrapper[4749]: E0219 18:45:44.083679 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerName="util" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.083685 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerName="util" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.083779 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc71b96-3cbf-4481-a9ac-d77071db9e39" containerName="extract" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.084189 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.087973 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-xbwm5" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.088137 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.088152 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.101188 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4thb7"] Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.133067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jj5\" (UniqueName: \"kubernetes.io/projected/0339e5b6-a614-4424-8375-01b24fd90b54-kube-api-access-d4jj5\") pod \"nmstate-operator-694c9596b7-4thb7\" (UID: \"0339e5b6-a614-4424-8375-01b24fd90b54\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.233790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jj5\" (UniqueName: \"kubernetes.io/projected/0339e5b6-a614-4424-8375-01b24fd90b54-kube-api-access-d4jj5\") pod \"nmstate-operator-694c9596b7-4thb7\" (UID: \"0339e5b6-a614-4424-8375-01b24fd90b54\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.254299 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jj5\" (UniqueName: \"kubernetes.io/projected/0339e5b6-a614-4424-8375-01b24fd90b54-kube-api-access-d4jj5\") pod \"nmstate-operator-694c9596b7-4thb7\" (UID: 
\"0339e5b6-a614-4424-8375-01b24fd90b54\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.408828 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.604937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4thb7"] Feb 19 18:45:44 crc kubenswrapper[4749]: I0219 18:45:44.903656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" event={"ID":"0339e5b6-a614-4424-8375-01b24fd90b54","Type":"ContainerStarted","Data":"2e46c212455208a2bfdff0eebe65b5d41920f04d6e8064d73d3d073bec9c6ebb"} Feb 19 18:45:47 crc kubenswrapper[4749]: I0219 18:45:47.922874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" event={"ID":"0339e5b6-a614-4424-8375-01b24fd90b54","Type":"ContainerStarted","Data":"9e945cf1b155d23b4495b68814bcabe1c4eaf8851092fff580476f9d1e1bca0a"} Feb 19 18:45:47 crc kubenswrapper[4749]: I0219 18:45:47.944912 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-4thb7" podStartSLOduration=1.778907304 podStartE2EDuration="3.944875135s" podCreationTimestamp="2026-02-19 18:45:44 +0000 UTC" firstStartedPulling="2026-02-19 18:45:44.62193587 +0000 UTC m=+718.583155824" lastFinishedPulling="2026-02-19 18:45:46.787903701 +0000 UTC m=+720.749123655" observedRunningTime="2026-02-19 18:45:47.941514544 +0000 UTC m=+721.902734498" watchObservedRunningTime="2026-02-19 18:45:47.944875135 +0000 UTC m=+721.906095119" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.104616 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kc58p"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 
18:45:54.106093 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.110702 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.111400 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.115418 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-5rsrl" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.116619 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.121834 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kc58p"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.140170 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.152675 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vg82k"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.153384 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.162340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c7d501a-b552-4c50-960c-63ae5826b93a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9lfmz\" (UID: \"9c7d501a-b552-4c50-960c-63ae5826b93a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.162443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-dbus-socket\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.162479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt69j\" (UniqueName: \"kubernetes.io/projected/9c7d501a-b552-4c50-960c-63ae5826b93a-kube-api-access-qt69j\") pod \"nmstate-webhook-866bcb46dc-9lfmz\" (UID: \"9c7d501a-b552-4c50-960c-63ae5826b93a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.162509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgdlb\" (UniqueName: \"kubernetes.io/projected/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-kube-api-access-kgdlb\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.162557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4746l\" (UniqueName: 
\"kubernetes.io/projected/2f498838-9a5f-4320-9044-3602de46b7cb-kube-api-access-4746l\") pod \"nmstate-metrics-58c85c668d-kc58p\" (UID: \"2f498838-9a5f-4320-9044-3602de46b7cb\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.162586 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-nmstate-lock\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.162608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-ovs-socket\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.245237 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.245902 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.247925 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.248295 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-llkdh" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.249566 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.255720 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.263948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c7d501a-b552-4c50-960c-63ae5826b93a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9lfmz\" (UID: \"9c7d501a-b552-4c50-960c-63ae5826b93a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-dbus-socket\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt69j\" (UniqueName: \"kubernetes.io/projected/9c7d501a-b552-4c50-960c-63ae5826b93a-kube-api-access-qt69j\") pod \"nmstate-webhook-866bcb46dc-9lfmz\" (UID: \"9c7d501a-b552-4c50-960c-63ae5826b93a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc 
kubenswrapper[4749]: I0219 18:45:54.264086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgdlb\" (UniqueName: \"kubernetes.io/projected/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-kube-api-access-kgdlb\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4746l\" (UniqueName: \"kubernetes.io/projected/2f498838-9a5f-4320-9044-3602de46b7cb-kube-api-access-4746l\") pod \"nmstate-metrics-58c85c668d-kc58p\" (UID: \"2f498838-9a5f-4320-9044-3602de46b7cb\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-nmstate-lock\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-ovs-socket\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: E0219 18:45:54.264164 4749 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-ovs-socket\") pod 
\"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: E0219 18:45:54.264233 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c7d501a-b552-4c50-960c-63ae5826b93a-tls-key-pair podName:9c7d501a-b552-4c50-960c-63ae5826b93a nodeName:}" failed. No retries permitted until 2026-02-19 18:45:54.764213016 +0000 UTC m=+728.725433050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/9c7d501a-b552-4c50-960c-63ae5826b93a-tls-key-pair") pod "nmstate-webhook-866bcb46dc-9lfmz" (UID: "9c7d501a-b552-4c50-960c-63ae5826b93a") : secret "openshift-nmstate-webhook" not found Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-dbus-socket\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.264741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-nmstate-lock\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.283742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgdlb\" (UniqueName: \"kubernetes.io/projected/6cacdce3-57f6-4ae5-bcdd-6d94b938a155-kube-api-access-kgdlb\") pod \"nmstate-handler-vg82k\" (UID: \"6cacdce3-57f6-4ae5-bcdd-6d94b938a155\") " pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.285594 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qt69j\" (UniqueName: \"kubernetes.io/projected/9c7d501a-b552-4c50-960c-63ae5826b93a-kube-api-access-qt69j\") pod \"nmstate-webhook-866bcb46dc-9lfmz\" (UID: \"9c7d501a-b552-4c50-960c-63ae5826b93a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.285848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4746l\" (UniqueName: \"kubernetes.io/projected/2f498838-9a5f-4320-9044-3602de46b7cb-kube-api-access-4746l\") pod \"nmstate-metrics-58c85c668d-kc58p\" (UID: \"2f498838-9a5f-4320-9044-3602de46b7cb\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.365956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.366065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.366126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z8jv\" (UniqueName: \"kubernetes.io/projected/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-kube-api-access-8z8jv\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.420060 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-ccb7b5f57-twmb6"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.420748 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.424772 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.467418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.467495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z8jv\" (UniqueName: \"kubernetes.io/projected/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-kube-api-access-8z8jv\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.467542 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: E0219 18:45:54.467695 4749 secret.go:188] Couldn't get secret 
openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 19 18:45:54 crc kubenswrapper[4749]: E0219 18:45:54.467756 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-plugin-serving-cert podName:84b14d9d-fa97-4391-9bd8-f3c5680ad7d1 nodeName:}" failed. No retries permitted until 2026-02-19 18:45:54.967740757 +0000 UTC m=+728.928960711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-b57rs" (UID: "84b14d9d-fa97-4391-9bd8-f3c5680ad7d1") : secret "plugin-serving-cert" not found Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.467990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.468592 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.488145 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ccb7b5f57-twmb6"] Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.497651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z8jv\" (UniqueName: \"kubernetes.io/projected/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-kube-api-access-8z8jv\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.568301 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-trusted-ca-bundle\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.568574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-oauth-serving-cert\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.568603 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-oauth-config\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.568644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-config\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.568679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-service-ca\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 
crc kubenswrapper[4749]: I0219 18:45:54.568717 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-serving-cert\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.568758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58jt6\" (UniqueName: \"kubernetes.io/projected/0099dbe8-2656-4437-b63c-c7c2b05375d0-kube-api-access-58jt6\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.670908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-serving-cert\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.670943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58jt6\" (UniqueName: \"kubernetes.io/projected/0099dbe8-2656-4437-b63c-c7c2b05375d0-kube-api-access-58jt6\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.670992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-trusted-ca-bundle\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " 
pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.671049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-oauth-serving-cert\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.671086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-oauth-config\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.671113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-config\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.671146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-service-ca\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.671964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-service-ca\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc 
kubenswrapper[4749]: I0219 18:45:54.672358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-trusted-ca-bundle\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.674542 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-serving-cert\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.674924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-config\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.675599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0099dbe8-2656-4437-b63c-c7c2b05375d0-console-oauth-config\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.676302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0099dbe8-2656-4437-b63c-c7c2b05375d0-oauth-serving-cert\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.686396 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58jt6\" (UniqueName: \"kubernetes.io/projected/0099dbe8-2656-4437-b63c-c7c2b05375d0-kube-api-access-58jt6\") pod \"console-ccb7b5f57-twmb6\" (UID: \"0099dbe8-2656-4437-b63c-c7c2b05375d0\") " pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.738445 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ccb7b5f57-twmb6" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.784747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c7d501a-b552-4c50-960c-63ae5826b93a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9lfmz\" (UID: \"9c7d501a-b552-4c50-960c-63ae5826b93a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.803782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c7d501a-b552-4c50-960c-63ae5826b93a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9lfmz\" (UID: \"9c7d501a-b552-4c50-960c-63ae5826b93a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.876235 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kc58p"] Feb 19 18:45:54 crc kubenswrapper[4749]: W0219 18:45:54.881710 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f498838_9a5f_4320_9044_3602de46b7cb.slice/crio-f5da8a7c3a850f1089c994ce4555ccf834261e8a1afa5a1cd16cf15b040ba99a WatchSource:0}: Error finding container f5da8a7c3a850f1089c994ce4555ccf834261e8a1afa5a1cd16cf15b040ba99a: Status 404 returned error can't find the container with id 
f5da8a7c3a850f1089c994ce4555ccf834261e8a1afa5a1cd16cf15b040ba99a Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.965985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" event={"ID":"2f498838-9a5f-4320-9044-3602de46b7cb","Type":"ContainerStarted","Data":"f5da8a7c3a850f1089c994ce4555ccf834261e8a1afa5a1cd16cf15b040ba99a"} Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.967088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vg82k" event={"ID":"6cacdce3-57f6-4ae5-bcdd-6d94b938a155","Type":"ContainerStarted","Data":"8e308f3d7999313f9d59d936b0fe52060d8348d42a5684324336ffabc80a74ea"} Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.986901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:54 crc kubenswrapper[4749]: I0219 18:45:54.992661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b14d9d-fa97-4391-9bd8-f3c5680ad7d1-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-b57rs\" (UID: \"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.035113 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.158448 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.234478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ccb7b5f57-twmb6"] Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.242081 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz"] Feb 19 18:45:55 crc kubenswrapper[4749]: W0219 18:45:55.247231 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0099dbe8_2656_4437_b63c_c7c2b05375d0.slice/crio-b385adc71029966c68c054401ea7101e18e13ed49190a5b342bb9cf3115cb671 WatchSource:0}: Error finding container b385adc71029966c68c054401ea7101e18e13ed49190a5b342bb9cf3115cb671: Status 404 returned error can't find the container with id b385adc71029966c68c054401ea7101e18e13ed49190a5b342bb9cf3115cb671 Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.380449 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs"] Feb 19 18:45:55 crc kubenswrapper[4749]: W0219 18:45:55.387947 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84b14d9d_fa97_4391_9bd8_f3c5680ad7d1.slice/crio-50c972474363576dc0011820646b9312fcdd87b6df5366dd6a268cb0c3d3c30d WatchSource:0}: Error finding container 50c972474363576dc0011820646b9312fcdd87b6df5366dd6a268cb0c3d3c30d: Status 404 returned error can't find the container with id 50c972474363576dc0011820646b9312fcdd87b6df5366dd6a268cb0c3d3c30d Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.978333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" 
event={"ID":"9c7d501a-b552-4c50-960c-63ae5826b93a","Type":"ContainerStarted","Data":"85d9fdce4234b210bea251316394af7949734d06fccfd9d399a1d19e1daf2e71"} Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.980314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" event={"ID":"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1","Type":"ContainerStarted","Data":"50c972474363576dc0011820646b9312fcdd87b6df5366dd6a268cb0c3d3c30d"} Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.981852 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ccb7b5f57-twmb6" event={"ID":"0099dbe8-2656-4437-b63c-c7c2b05375d0","Type":"ContainerStarted","Data":"51da961501b15cb9e00e26295d88002043b8a3af7afdf6bc224d59b2a48e0f06"} Feb 19 18:45:55 crc kubenswrapper[4749]: I0219 18:45:55.981877 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ccb7b5f57-twmb6" event={"ID":"0099dbe8-2656-4437-b63c-c7c2b05375d0","Type":"ContainerStarted","Data":"b385adc71029966c68c054401ea7101e18e13ed49190a5b342bb9cf3115cb671"} Feb 19 18:45:56 crc kubenswrapper[4749]: I0219 18:45:56.007826 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ccb7b5f57-twmb6" podStartSLOduration=2.007806204 podStartE2EDuration="2.007806204s" podCreationTimestamp="2026-02-19 18:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:45:56.000541 +0000 UTC m=+729.961760974" watchObservedRunningTime="2026-02-19 18:45:56.007806204 +0000 UTC m=+729.969026168" Feb 19 18:45:56 crc kubenswrapper[4749]: I0219 18:45:56.986866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" 
event={"ID":"9c7d501a-b552-4c50-960c-63ae5826b93a","Type":"ContainerStarted","Data":"b62aff1b54d403ac959ab3881b4c1c29b2dc3fd8ef6f7711aa45ac6809d845ab"} Feb 19 18:45:56 crc kubenswrapper[4749]: I0219 18:45:56.987327 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" Feb 19 18:45:56 crc kubenswrapper[4749]: I0219 18:45:56.988013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" event={"ID":"2f498838-9a5f-4320-9044-3602de46b7cb","Type":"ContainerStarted","Data":"01edbd629e496863bfa9c12eaa58f92070bee0390bb845e8d9ce32bb0023a160"} Feb 19 18:45:57 crc kubenswrapper[4749]: I0219 18:45:57.009659 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz" podStartSLOduration=1.599013801 podStartE2EDuration="3.00963892s" podCreationTimestamp="2026-02-19 18:45:54 +0000 UTC" firstStartedPulling="2026-02-19 18:45:55.279681637 +0000 UTC m=+729.240901591" lastFinishedPulling="2026-02-19 18:45:56.690306756 +0000 UTC m=+730.651526710" observedRunningTime="2026-02-19 18:45:57.003364189 +0000 UTC m=+730.964584143" watchObservedRunningTime="2026-02-19 18:45:57.00963892 +0000 UTC m=+730.970858874" Feb 19 18:45:58 crc kubenswrapper[4749]: I0219 18:45:58.000834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" event={"ID":"84b14d9d-fa97-4391-9bd8-f3c5680ad7d1","Type":"ContainerStarted","Data":"f358d94c6064a11337c624b952f570fe5cffc1447489a1efe7a88bb0cf4c8842"} Feb 19 18:45:58 crc kubenswrapper[4749]: I0219 18:45:58.003856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vg82k" event={"ID":"6cacdce3-57f6-4ae5-bcdd-6d94b938a155","Type":"ContainerStarted","Data":"d8b476232a1abbc99c6640876a49af1aafdac54df669eb70fd54c15deaf38b8f"} Feb 19 18:45:58 crc kubenswrapper[4749]: I0219 
18:45:58.004325 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:45:58 crc kubenswrapper[4749]: I0219 18:45:58.015691 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-b57rs" podStartSLOduration=1.785661487 podStartE2EDuration="4.015675946s" podCreationTimestamp="2026-02-19 18:45:54 +0000 UTC" firstStartedPulling="2026-02-19 18:45:55.391597497 +0000 UTC m=+729.352817451" lastFinishedPulling="2026-02-19 18:45:57.621611956 +0000 UTC m=+731.582831910" observedRunningTime="2026-02-19 18:45:58.012352246 +0000 UTC m=+731.973572200" watchObservedRunningTime="2026-02-19 18:45:58.015675946 +0000 UTC m=+731.976895900" Feb 19 18:45:58 crc kubenswrapper[4749]: I0219 18:45:58.031470 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vg82k" podStartSLOduration=1.8541886650000001 podStartE2EDuration="4.031447175s" podCreationTimestamp="2026-02-19 18:45:54 +0000 UTC" firstStartedPulling="2026-02-19 18:45:54.511123879 +0000 UTC m=+728.472343833" lastFinishedPulling="2026-02-19 18:45:56.688382359 +0000 UTC m=+730.649602343" observedRunningTime="2026-02-19 18:45:58.027439219 +0000 UTC m=+731.988659173" watchObservedRunningTime="2026-02-19 18:45:58.031447175 +0000 UTC m=+731.992667129" Feb 19 18:46:00 crc kubenswrapper[4749]: I0219 18:46:00.018441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" event={"ID":"2f498838-9a5f-4320-9044-3602de46b7cb","Type":"ContainerStarted","Data":"4806c54f239b04dcfb0f46bdcb69585ab8dbbd87f812ddf9102c35e96ce2187c"} Feb 19 18:46:04 crc kubenswrapper[4749]: I0219 18:46:04.497892 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vg82k" Feb 19 18:46:04 crc kubenswrapper[4749]: I0219 18:46:04.523106 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kc58p" podStartSLOduration=6.119038341 podStartE2EDuration="10.523086524s" podCreationTimestamp="2026-02-19 18:45:54 +0000 UTC" firstStartedPulling="2026-02-19 18:45:54.884246625 +0000 UTC m=+728.845466579" lastFinishedPulling="2026-02-19 18:45:59.288294808 +0000 UTC m=+733.249514762" observedRunningTime="2026-02-19 18:46:00.036929489 +0000 UTC m=+733.998149453" watchObservedRunningTime="2026-02-19 18:46:04.523086524 +0000 UTC m=+738.484306488"
Feb 19 18:46:04 crc kubenswrapper[4749]: I0219 18:46:04.739360 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-ccb7b5f57-twmb6"
Feb 19 18:46:04 crc kubenswrapper[4749]: I0219 18:46:04.740261 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-ccb7b5f57-twmb6"
Feb 19 18:46:04 crc kubenswrapper[4749]: I0219 18:46:04.746049 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-ccb7b5f57-twmb6"
Feb 19 18:46:05 crc kubenswrapper[4749]: I0219 18:46:05.066779 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-ccb7b5f57-twmb6"
Feb 19 18:46:05 crc kubenswrapper[4749]: I0219 18:46:05.155146 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rnkj5"]
Feb 19 18:46:15 crc kubenswrapper[4749]: I0219 18:46:15.041380 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9lfmz"
Feb 19 18:46:24 crc kubenswrapper[4749]: I0219 18:46:24.725613 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 18:46:24 crc kubenswrapper[4749]: I0219 18:46:24.726460 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.689446 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"]
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.691386 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"]
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.691480 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.694441 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.873883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.874006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.874090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vxdk\" (UniqueName: \"kubernetes.io/projected/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-kube-api-access-7vxdk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.975233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.975311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vxdk\" (UniqueName: \"kubernetes.io/projected/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-kube-api-access-7vxdk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.975347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.975727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.975811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:28 crc kubenswrapper[4749]: I0219 18:46:28.993687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vxdk\" (UniqueName: \"kubernetes.io/projected/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-kube-api-access-7vxdk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:29 crc kubenswrapper[4749]: I0219 18:46:29.014918 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:29 crc kubenswrapper[4749]: I0219 18:46:29.471701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"]
Feb 19 18:46:29 crc kubenswrapper[4749]: W0219 18:46:29.476555 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0e297e_8bb2_4e13_95d4_ad2c7b2a79fe.slice/crio-97808a798cf2dedd47156a278c5bbd11821eaa57fd4c83cc4d143781224e4113 WatchSource:0}: Error finding container 97808a798cf2dedd47156a278c5bbd11821eaa57fd4c83cc4d143781224e4113: Status 404 returned error can't find the container with id 97808a798cf2dedd47156a278c5bbd11821eaa57fd4c83cc4d143781224e4113
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.197595 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rnkj5" podUID="be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" containerName="console" containerID="cri-o://30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b" gracePeriod=15
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.277411 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerID="1d3523bc0db40e74a5927108257a95b4b791e27792807fe3cfd92bd0caf5c136" exitCode=0
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.277473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b" event={"ID":"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe","Type":"ContainerDied","Data":"1d3523bc0db40e74a5927108257a95b4b791e27792807fe3cfd92bd0caf5c136"}
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.277511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b" event={"ID":"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe","Type":"ContainerStarted","Data":"97808a798cf2dedd47156a278c5bbd11821eaa57fd4c83cc4d143781224e4113"}
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.575853 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rnkj5_be9f2c50-2589-4bbb-b193-9ee20f2a6c0d/console/0.log"
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.576180 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rnkj5"
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.698461 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-oauth-serving-cert\") pod \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") "
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.698517 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-oauth-config\") pod \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") "
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.698542 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-trusted-ca-bundle\") pod \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") "
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.698604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-config\") pod \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") "
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.698674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-service-ca\") pod \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") "
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.699317 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" (UID: "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.699361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-service-ca" (OuterVolumeSpecName: "service-ca") pod "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" (UID: "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.699326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-config" (OuterVolumeSpecName: "console-config") pod "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" (UID: "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.699674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-serving-cert\") pod \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") "
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.699716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" (UID: "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.699748 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkx6v\" (UniqueName: \"kubernetes.io/projected/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-kube-api-access-kkx6v\") pod \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\" (UID: \"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d\") "
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.700409 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.700441 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-service-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.700453 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.704128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-kube-api-access-kkx6v" (OuterVolumeSpecName: "kube-api-access-kkx6v") pod "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" (UID: "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d"). InnerVolumeSpecName "kube-api-access-kkx6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.705872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" (UID: "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.714656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" (UID: "be9f2c50-2589-4bbb-b193-9ee20f2a6c0d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.801307 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.801337 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.801346 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:30 crc kubenswrapper[4749]: I0219 18:46:30.801356 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkx6v\" (UniqueName: \"kubernetes.io/projected/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d-kube-api-access-kkx6v\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.286642 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rnkj5_be9f2c50-2589-4bbb-b193-9ee20f2a6c0d/console/0.log"
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.286731 4749 generic.go:334] "Generic (PLEG): container finished" podID="be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" containerID="30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b" exitCode=2
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.286777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rnkj5" event={"ID":"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d","Type":"ContainerDied","Data":"30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b"}
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.286813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rnkj5" event={"ID":"be9f2c50-2589-4bbb-b193-9ee20f2a6c0d","Type":"ContainerDied","Data":"27a35c674a36e91af9928f6e8410b6f6ec228b27013665bd0074ddeaf38e663b"}
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.286841 4749 scope.go:117] "RemoveContainer" containerID="30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b"
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.286923 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rnkj5"
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.324260 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rnkj5"]
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.329563 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rnkj5"]
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.478083 4749 scope.go:117] "RemoveContainer" containerID="30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b"
Feb 19 18:46:31 crc kubenswrapper[4749]: E0219 18:46:31.478624 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b\": container with ID starting with 30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b not found: ID does not exist" containerID="30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b"
Feb 19 18:46:31 crc kubenswrapper[4749]: I0219 18:46:31.478682 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b"} err="failed to get container status \"30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b\": rpc error: code = NotFound desc = could not find container \"30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b\": container with ID starting with 30a872228616b59c714cf50fa1aa236b3d75323377aa298e361fb4a89d4f411b not found: ID does not exist"
Feb 19 18:46:32 crc kubenswrapper[4749]: I0219 18:46:32.295046 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerID="71e853c5ae0811ed09a666ec2d6cea7ca378edffa72ba096e85b11950e78378c" exitCode=0
Feb 19 18:46:32 crc kubenswrapper[4749]: I0219 18:46:32.295148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b" event={"ID":"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe","Type":"ContainerDied","Data":"71e853c5ae0811ed09a666ec2d6cea7ca378edffa72ba096e85b11950e78378c"}
Feb 19 18:46:32 crc kubenswrapper[4749]: I0219 18:46:32.685731 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" path="/var/lib/kubelet/pods/be9f2c50-2589-4bbb-b193-9ee20f2a6c0d/volumes"
Feb 19 18:46:33 crc kubenswrapper[4749]: I0219 18:46:33.305989 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerID="1556dfaf6e65605ca7813f94861150119d4b4284bf5e50713e29ea73597faf3c" exitCode=0
Feb 19 18:46:33 crc kubenswrapper[4749]: I0219 18:46:33.306051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b" event={"ID":"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe","Type":"ContainerDied","Data":"1556dfaf6e65605ca7813f94861150119d4b4284bf5e50713e29ea73597faf3c"}
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.568781 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.746381 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-bundle\") pod \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") "
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.746714 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-util\") pod \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") "
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.746780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vxdk\" (UniqueName: \"kubernetes.io/projected/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-kube-api-access-7vxdk\") pod \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\" (UID: \"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe\") "
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.748697 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-bundle" (OuterVolumeSpecName: "bundle") pod "8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" (UID: "8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.753180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-kube-api-access-7vxdk" (OuterVolumeSpecName: "kube-api-access-7vxdk") pod "8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" (UID: "8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe"). InnerVolumeSpecName "kube-api-access-7vxdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.777401 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-util" (OuterVolumeSpecName: "util") pod "8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" (UID: "8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.848284 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.848352 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-util\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:34 crc kubenswrapper[4749]: I0219 18:46:34.848382 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vxdk\" (UniqueName: \"kubernetes.io/projected/8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe-kube-api-access-7vxdk\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.040406 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wbv9h"]
Feb 19 18:46:35 crc kubenswrapper[4749]: E0219 18:46:35.040728 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" containerName="console"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.040745 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" containerName="console"
Feb 19 18:46:35 crc kubenswrapper[4749]: E0219 18:46:35.040792 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerName="util"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.040801 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerName="util"
Feb 19 18:46:35 crc kubenswrapper[4749]: E0219 18:46:35.040813 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerName="pull"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.040820 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerName="pull"
Feb 19 18:46:35 crc kubenswrapper[4749]: E0219 18:46:35.040830 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerName="extract"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.040838 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerName="extract"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.041054 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9f2c50-2589-4bbb-b193-9ee20f2a6c0d" containerName="console"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.041078 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe" containerName="extract"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.042339 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.060222 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbv9h"]
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.151673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-utilities\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.151742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-catalog-content\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.151800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdsdb\" (UniqueName: \"kubernetes.io/projected/247fdb46-14d6-40a6-bf70-23ed9a1c4841-kube-api-access-vdsdb\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.253706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdsdb\" (UniqueName: \"kubernetes.io/projected/247fdb46-14d6-40a6-bf70-23ed9a1c4841-kube-api-access-vdsdb\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.253782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-utilities\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.253808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-catalog-content\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.254259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-catalog-content\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.254342 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-utilities\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.271582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdsdb\" (UniqueName: \"kubernetes.io/projected/247fdb46-14d6-40a6-bf70-23ed9a1c4841-kube-api-access-vdsdb\") pod \"redhat-marketplace-wbv9h\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.318727 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b" event={"ID":"8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe","Type":"ContainerDied","Data":"97808a798cf2dedd47156a278c5bbd11821eaa57fd4c83cc4d143781224e4113"}
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.319043 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97808a798cf2dedd47156a278c5bbd11821eaa57fd4c83cc4d143781224e4113"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.318814 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.426878 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbv9h"
Feb 19 18:46:35 crc kubenswrapper[4749]: I0219 18:46:35.853256 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbv9h"]
Feb 19 18:46:36 crc kubenswrapper[4749]: I0219 18:46:36.326853 4749 generic.go:334] "Generic (PLEG): container finished" podID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerID="53b6281b165e657ca42cc38c9829498fa3bc9206eed56d7882272a861c64a378" exitCode=0
Feb 19 18:46:36 crc kubenswrapper[4749]: I0219 18:46:36.326920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbv9h" event={"ID":"247fdb46-14d6-40a6-bf70-23ed9a1c4841","Type":"ContainerDied","Data":"53b6281b165e657ca42cc38c9829498fa3bc9206eed56d7882272a861c64a378"}
Feb 19 18:46:36 crc kubenswrapper[4749]: I0219 18:46:36.327197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbv9h" event={"ID":"247fdb46-14d6-40a6-bf70-23ed9a1c4841","Type":"ContainerStarted","Data":"cd890cfc9628a96378e86936aed40d88b27fb22f3c79c46fa063472ac447b9a8"}
Feb 19 18:46:38 crc kubenswrapper[4749]: I0219 18:46:38.339549 4749 generic.go:334] "Generic (PLEG): container finished" podID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerID="e9ddbea5c3fbfdec50ae62f4eb7b129a5fd5d32a83bbb1a7d8b2aafaca18884d" exitCode=0
Feb 19 18:46:38 crc kubenswrapper[4749]: I0219 18:46:38.339598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbv9h" event={"ID":"247fdb46-14d6-40a6-bf70-23ed9a1c4841","Type":"ContainerDied","Data":"e9ddbea5c3fbfdec50ae62f4eb7b129a5fd5d32a83bbb1a7d8b2aafaca18884d"}
Feb 19 18:46:39 crc kubenswrapper[4749]: I0219 18:46:39.348208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbv9h" event={"ID":"247fdb46-14d6-40a6-bf70-23ed9a1c4841","Type":"ContainerStarted","Data":"5920f7e8e48bd9db54656e5fa85d999dbef914999968cb69de532e4da7aa1a46"}
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.028141 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wbv9h" podStartSLOduration=3.63420513 podStartE2EDuration="6.028122638s" podCreationTimestamp="2026-02-19 18:46:35 +0000 UTC" firstStartedPulling="2026-02-19 18:46:36.329122576 +0000 UTC m=+770.290342530" lastFinishedPulling="2026-02-19 18:46:38.723040084 +0000 UTC m=+772.684260038" observedRunningTime="2026-02-19 18:46:39.369857468 +0000 UTC m=+773.331077442" watchObservedRunningTime="2026-02-19 18:46:41.028122638 +0000 UTC m=+774.989342592"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.029424 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-brrht"]
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.030574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.040876 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brrht"]
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.226187 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-catalog-content\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.226297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-utilities\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.226346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5b7\" (UniqueName: \"kubernetes.io/projected/6511820d-c20f-4210-a8a1-2d8479ed7331-kube-api-access-vn5b7\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.327016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-utilities\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.327126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5b7\" (UniqueName: \"kubernetes.io/projected/6511820d-c20f-4210-a8a1-2d8479ed7331-kube-api-access-vn5b7\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.327155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-catalog-content\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.327625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-utilities\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.327652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-catalog-content\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.345380 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5b7\" (UniqueName: \"kubernetes.io/projected/6511820d-c20f-4210-a8a1-2d8479ed7331-kube-api-access-vn5b7\") pod \"community-operators-brrht\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") " pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.359305 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:41 crc kubenswrapper[4749]: I0219 18:46:41.752188 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brrht"]
Feb 19 18:46:42 crc kubenswrapper[4749]: I0219 18:46:42.364708 4749 generic.go:334] "Generic (PLEG): container finished" podID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerID="aea9119caacca98b02c9a0bf6923912c73bac88ea1d33a9ba60ca88d98b648d1" exitCode=0
Feb 19 18:46:42 crc kubenswrapper[4749]: I0219 18:46:42.364812 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brrht" event={"ID":"6511820d-c20f-4210-a8a1-2d8479ed7331","Type":"ContainerDied","Data":"aea9119caacca98b02c9a0bf6923912c73bac88ea1d33a9ba60ca88d98b648d1"}
Feb 19 18:46:42 crc kubenswrapper[4749]: I0219 18:46:42.365065 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brrht" event={"ID":"6511820d-c20f-4210-a8a1-2d8479ed7331","Type":"ContainerStarted","Data":"931f4bbb375fe94c58b5d0298c66b5e418f403955dd2f16e0c3c0573683fff60"}
Feb 19 18:46:43 crc kubenswrapper[4749]: I0219 18:46:43.376523 4749 generic.go:334] "Generic (PLEG): container finished" podID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerID="07eb24987ecc3af7d0a1db9108969c7a9b089105c8729c7e43d0e851dbf649a2" exitCode=0
Feb 19 18:46:43 crc kubenswrapper[4749]: I0219 18:46:43.376620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brrht" event={"ID":"6511820d-c20f-4210-a8a1-2d8479ed7331","Type":"ContainerDied","Data":"07eb24987ecc3af7d0a1db9108969c7a9b089105c8729c7e43d0e851dbf649a2"}
Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.384092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brrht"
event={"ID":"6511820d-c20f-4210-a8a1-2d8479ed7331","Type":"ContainerStarted","Data":"93f529e2838bb0e0038c821f094efa1bf49599a2f519841bebf74aff7282c408"} Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.404239 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-brrht" podStartSLOduration=1.886723753 podStartE2EDuration="3.40421922s" podCreationTimestamp="2026-02-19 18:46:41 +0000 UTC" firstStartedPulling="2026-02-19 18:46:42.367370552 +0000 UTC m=+776.328590506" lastFinishedPulling="2026-02-19 18:46:43.884866019 +0000 UTC m=+777.846085973" observedRunningTime="2026-02-19 18:46:44.400627983 +0000 UTC m=+778.361847937" watchObservedRunningTime="2026-02-19 18:46:44.40421922 +0000 UTC m=+778.365439174" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.727168 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5"] Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.727846 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.730700 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.730784 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-cqbql" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.730874 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.730934 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.731050 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.752181 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5"] Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.873788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bcecd22a-15ba-4bca-8be3-9cc08843c86d-apiservice-cert\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.873851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bcecd22a-15ba-4bca-8be3-9cc08843c86d-webhook-cert\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: 
\"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.874058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8nz\" (UniqueName: \"kubernetes.io/projected/bcecd22a-15ba-4bca-8be3-9cc08843c86d-kube-api-access-lw8nz\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.975400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bcecd22a-15ba-4bca-8be3-9cc08843c86d-apiservice-cert\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.975457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bcecd22a-15ba-4bca-8be3-9cc08843c86d-webhook-cert\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.975487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8nz\" (UniqueName: \"kubernetes.io/projected/bcecd22a-15ba-4bca-8be3-9cc08843c86d-kube-api-access-lw8nz\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.981317 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bcecd22a-15ba-4bca-8be3-9cc08843c86d-webhook-cert\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.982787 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bcecd22a-15ba-4bca-8be3-9cc08843c86d-apiservice-cert\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:44 crc kubenswrapper[4749]: I0219 18:46:44.994694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8nz\" (UniqueName: \"kubernetes.io/projected/bcecd22a-15ba-4bca-8be3-9cc08843c86d-kube-api-access-lw8nz\") pod \"metallb-operator-controller-manager-74989bddb6-dcst5\" (UID: \"bcecd22a-15ba-4bca-8be3-9cc08843c86d\") " pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.030297 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7"] Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.031042 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.032890 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.033237 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.035605 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-k5sll" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.044541 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.071798 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7"] Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.180330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c6d5734-2093-42d7-a330-59c6dc0dc138-apiservice-cert\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.180385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhfdv\" (UniqueName: \"kubernetes.io/projected/9c6d5734-2093-42d7-a330-59c6dc0dc138-kube-api-access-jhfdv\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: 
I0219 18:46:45.180438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c6d5734-2093-42d7-a330-59c6dc0dc138-webhook-cert\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.281985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c6d5734-2093-42d7-a330-59c6dc0dc138-apiservice-cert\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.282465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhfdv\" (UniqueName: \"kubernetes.io/projected/9c6d5734-2093-42d7-a330-59c6dc0dc138-kube-api-access-jhfdv\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.288476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c6d5734-2093-42d7-a330-59c6dc0dc138-webhook-cert\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.292173 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c6d5734-2093-42d7-a330-59c6dc0dc138-webhook-cert\") pod 
\"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.292469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c6d5734-2093-42d7-a330-59c6dc0dc138-apiservice-cert\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.305365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhfdv\" (UniqueName: \"kubernetes.io/projected/9c6d5734-2093-42d7-a330-59c6dc0dc138-kube-api-access-jhfdv\") pod \"metallb-operator-webhook-server-66b7b94c9b-n69x7\" (UID: \"9c6d5734-2093-42d7-a330-59c6dc0dc138\") " pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.314734 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5"] Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.345729 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.394202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" event={"ID":"bcecd22a-15ba-4bca-8be3-9cc08843c86d","Type":"ContainerStarted","Data":"0395756f5083b23da96dd3456037a205437ec38300776d669baf58e376235cd9"} Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.427444 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wbv9h" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.427496 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wbv9h" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.476549 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wbv9h" Feb 19 18:46:45 crc kubenswrapper[4749]: I0219 18:46:45.828661 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7"] Feb 19 18:46:45 crc kubenswrapper[4749]: W0219 18:46:45.833829 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c6d5734_2093_42d7_a330_59c6dc0dc138.slice/crio-2f54283d43ddab98d662f2fe181193648e46c17cd596d639264b5b93af0a08c5 WatchSource:0}: Error finding container 2f54283d43ddab98d662f2fe181193648e46c17cd596d639264b5b93af0a08c5: Status 404 returned error can't find the container with id 2f54283d43ddab98d662f2fe181193648e46c17cd596d639264b5b93af0a08c5 Feb 19 18:46:46 crc kubenswrapper[4749]: I0219 18:46:46.400941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" 
event={"ID":"9c6d5734-2093-42d7-a330-59c6dc0dc138","Type":"ContainerStarted","Data":"2f54283d43ddab98d662f2fe181193648e46c17cd596d639264b5b93af0a08c5"} Feb 19 18:46:46 crc kubenswrapper[4749]: I0219 18:46:46.445772 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wbv9h" Feb 19 18:46:47 crc kubenswrapper[4749]: I0219 18:46:47.621433 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbv9h"] Feb 19 18:46:48 crc kubenswrapper[4749]: I0219 18:46:48.425169 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wbv9h" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="registry-server" containerID="cri-o://5920f7e8e48bd9db54656e5fa85d999dbef914999968cb69de532e4da7aa1a46" gracePeriod=2 Feb 19 18:46:49 crc kubenswrapper[4749]: I0219 18:46:49.436198 4749 generic.go:334] "Generic (PLEG): container finished" podID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerID="5920f7e8e48bd9db54656e5fa85d999dbef914999968cb69de532e4da7aa1a46" exitCode=0 Feb 19 18:46:49 crc kubenswrapper[4749]: I0219 18:46:49.436235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbv9h" event={"ID":"247fdb46-14d6-40a6-bf70-23ed9a1c4841","Type":"ContainerDied","Data":"5920f7e8e48bd9db54656e5fa85d999dbef914999968cb69de532e4da7aa1a46"} Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.359563 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-brrht" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.360895 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-brrht" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.404479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-brrht" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.453652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbv9h" event={"ID":"247fdb46-14d6-40a6-bf70-23ed9a1c4841","Type":"ContainerDied","Data":"cd890cfc9628a96378e86936aed40d88b27fb22f3c79c46fa063472ac447b9a8"} Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.453732 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd890cfc9628a96378e86936aed40d88b27fb22f3c79c46fa063472ac447b9a8" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.470682 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbv9h" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.502052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-brrht" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.516294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-catalog-content\") pod \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.516396 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdsdb\" (UniqueName: \"kubernetes.io/projected/247fdb46-14d6-40a6-bf70-23ed9a1c4841-kube-api-access-vdsdb\") pod \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\" (UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.516440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-utilities\") pod \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\" 
(UID: \"247fdb46-14d6-40a6-bf70-23ed9a1c4841\") " Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.517288 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-utilities" (OuterVolumeSpecName: "utilities") pod "247fdb46-14d6-40a6-bf70-23ed9a1c4841" (UID: "247fdb46-14d6-40a6-bf70-23ed9a1c4841"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.523741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247fdb46-14d6-40a6-bf70-23ed9a1c4841-kube-api-access-vdsdb" (OuterVolumeSpecName: "kube-api-access-vdsdb") pod "247fdb46-14d6-40a6-bf70-23ed9a1c4841" (UID: "247fdb46-14d6-40a6-bf70-23ed9a1c4841"). InnerVolumeSpecName "kube-api-access-vdsdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.548214 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "247fdb46-14d6-40a6-bf70-23ed9a1c4841" (UID: "247fdb46-14d6-40a6-bf70-23ed9a1c4841"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.618344 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.618399 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdsdb\" (UniqueName: \"kubernetes.io/projected/247fdb46-14d6-40a6-bf70-23ed9a1c4841-kube-api-access-vdsdb\") on node \"crc\" DevicePath \"\"" Feb 19 18:46:51 crc kubenswrapper[4749]: I0219 18:46:51.618415 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247fdb46-14d6-40a6-bf70-23ed9a1c4841-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:46:52 crc kubenswrapper[4749]: I0219 18:46:52.503747 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbv9h" Feb 19 18:46:52 crc kubenswrapper[4749]: I0219 18:46:52.554204 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbv9h"] Feb 19 18:46:52 crc kubenswrapper[4749]: I0219 18:46:52.558750 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbv9h"] Feb 19 18:46:52 crc kubenswrapper[4749]: I0219 18:46:52.685897 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" path="/var/lib/kubelet/pods/247fdb46-14d6-40a6-bf70-23ed9a1c4841/volumes" Feb 19 18:46:53 crc kubenswrapper[4749]: I0219 18:46:53.511839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" event={"ID":"9c6d5734-2093-42d7-a330-59c6dc0dc138","Type":"ContainerStarted","Data":"dfdc45ab4f8353267a5000bf3661c4bf59e342ea5f4abb475bb652d58a3ad909"} 
Feb 19 18:46:53 crc kubenswrapper[4749]: I0219 18:46:53.513167 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" Feb 19 18:46:53 crc kubenswrapper[4749]: I0219 18:46:53.513864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" event={"ID":"bcecd22a-15ba-4bca-8be3-9cc08843c86d","Type":"ContainerStarted","Data":"dfca17aeea0fe0c46aa834e2df0e31c255d2c0a15fc51bb77169284dff2f52d6"} Feb 19 18:46:53 crc kubenswrapper[4749]: I0219 18:46:53.514077 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" Feb 19 18:46:53 crc kubenswrapper[4749]: I0219 18:46:53.537988 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7" podStartSLOduration=2.837052675 podStartE2EDuration="8.537961454s" podCreationTimestamp="2026-02-19 18:46:45 +0000 UTC" firstStartedPulling="2026-02-19 18:46:45.84028434 +0000 UTC m=+779.801504294" lastFinishedPulling="2026-02-19 18:46:51.541193119 +0000 UTC m=+785.502413073" observedRunningTime="2026-02-19 18:46:53.534284346 +0000 UTC m=+787.495504300" watchObservedRunningTime="2026-02-19 18:46:53.537961454 +0000 UTC m=+787.499181428" Feb 19 18:46:53 crc kubenswrapper[4749]: I0219 18:46:53.560209 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5" podStartSLOduration=3.395831742 podStartE2EDuration="9.560181928s" podCreationTimestamp="2026-02-19 18:46:44 +0000 UTC" firstStartedPulling="2026-02-19 18:46:45.325570961 +0000 UTC m=+779.286790915" lastFinishedPulling="2026-02-19 18:46:51.489921147 +0000 UTC m=+785.451141101" observedRunningTime="2026-02-19 18:46:53.556666714 +0000 UTC m=+787.517886668" 
watchObservedRunningTime="2026-02-19 18:46:53.560181928 +0000 UTC m=+787.521401872" Feb 19 18:46:54 crc kubenswrapper[4749]: I0219 18:46:54.725823 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:46:54 crc kubenswrapper[4749]: I0219 18:46:54.725944 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.228848 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brrht"] Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.229186 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-brrht" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="registry-server" containerID="cri-o://93f529e2838bb0e0038c821f094efa1bf49599a2f519841bebf74aff7282c408" gracePeriod=2 Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.537717 4749 generic.go:334] "Generic (PLEG): container finished" podID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerID="93f529e2838bb0e0038c821f094efa1bf49599a2f519841bebf74aff7282c408" exitCode=0 Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.538120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brrht" event={"ID":"6511820d-c20f-4210-a8a1-2d8479ed7331","Type":"ContainerDied","Data":"93f529e2838bb0e0038c821f094efa1bf49599a2f519841bebf74aff7282c408"} Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 
18:46:56.583682 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.748366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn5b7\" (UniqueName: \"kubernetes.io/projected/6511820d-c20f-4210-a8a1-2d8479ed7331-kube-api-access-vn5b7\") pod \"6511820d-c20f-4210-a8a1-2d8479ed7331\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") "
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.748555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-utilities\") pod \"6511820d-c20f-4210-a8a1-2d8479ed7331\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") "
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.748654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-catalog-content\") pod \"6511820d-c20f-4210-a8a1-2d8479ed7331\" (UID: \"6511820d-c20f-4210-a8a1-2d8479ed7331\") "
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.750988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-utilities" (OuterVolumeSpecName: "utilities") pod "6511820d-c20f-4210-a8a1-2d8479ed7331" (UID: "6511820d-c20f-4210-a8a1-2d8479ed7331"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.770264 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6511820d-c20f-4210-a8a1-2d8479ed7331-kube-api-access-vn5b7" (OuterVolumeSpecName: "kube-api-access-vn5b7") pod "6511820d-c20f-4210-a8a1-2d8479ed7331" (UID: "6511820d-c20f-4210-a8a1-2d8479ed7331"). InnerVolumeSpecName "kube-api-access-vn5b7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.820913 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6511820d-c20f-4210-a8a1-2d8479ed7331" (UID: "6511820d-c20f-4210-a8a1-2d8479ed7331"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.850184 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.850209 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn5b7\" (UniqueName: \"kubernetes.io/projected/6511820d-c20f-4210-a8a1-2d8479ed7331-kube-api-access-vn5b7\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:56 crc kubenswrapper[4749]: I0219 18:46:56.850222 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6511820d-c20f-4210-a8a1-2d8479ed7331-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:46:57 crc kubenswrapper[4749]: I0219 18:46:57.546472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brrht" event={"ID":"6511820d-c20f-4210-a8a1-2d8479ed7331","Type":"ContainerDied","Data":"931f4bbb375fe94c58b5d0298c66b5e418f403955dd2f16e0c3c0573683fff60"}
Feb 19 18:46:57 crc kubenswrapper[4749]: I0219 18:46:57.546538 4749 scope.go:117] "RemoveContainer" containerID="93f529e2838bb0e0038c821f094efa1bf49599a2f519841bebf74aff7282c408"
Feb 19 18:46:57 crc kubenswrapper[4749]: I0219 18:46:57.546567 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brrht"
Feb 19 18:46:57 crc kubenswrapper[4749]: I0219 18:46:57.567600 4749 scope.go:117] "RemoveContainer" containerID="07eb24987ecc3af7d0a1db9108969c7a9b089105c8729c7e43d0e851dbf649a2"
Feb 19 18:46:57 crc kubenswrapper[4749]: I0219 18:46:57.581800 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brrht"]
Feb 19 18:46:57 crc kubenswrapper[4749]: I0219 18:46:57.586730 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-brrht"]
Feb 19 18:46:57 crc kubenswrapper[4749]: I0219 18:46:57.595980 4749 scope.go:117] "RemoveContainer" containerID="aea9119caacca98b02c9a0bf6923912c73bac88ea1d33a9ba60ca88d98b648d1"
Feb 19 18:46:58 crc kubenswrapper[4749]: I0219 18:46:58.687094 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" path="/var/lib/kubelet/pods/6511820d-c20f-4210-a8a1-2d8479ed7331/volumes"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.273337 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vmt29"]
Feb 19 18:47:00 crc kubenswrapper[4749]: E0219 18:47:00.273864 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="extract-utilities"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.273876 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="extract-utilities"
Feb 19 18:47:00 crc kubenswrapper[4749]: E0219 18:47:00.273885 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="extract-utilities"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.273891 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="extract-utilities"
Feb 19 18:47:00 crc kubenswrapper[4749]: E0219 18:47:00.273902 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="extract-content"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.273909 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="extract-content"
Feb 19 18:47:00 crc kubenswrapper[4749]: E0219 18:47:00.273917 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="registry-server"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.273924 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="registry-server"
Feb 19 18:47:00 crc kubenswrapper[4749]: E0219 18:47:00.273935 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="registry-server"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.273941 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="registry-server"
Feb 19 18:47:00 crc kubenswrapper[4749]: E0219 18:47:00.273951 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="extract-content"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.273957 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="extract-content"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.274069 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6511820d-c20f-4210-a8a1-2d8479ed7331" containerName="registry-server"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.274079 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="247fdb46-14d6-40a6-bf70-23ed9a1c4841" containerName="registry-server"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.274780 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.286503 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmt29"]
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.293673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ll8\" (UniqueName: \"kubernetes.io/projected/b2e75082-e2e5-42c5-adde-61d2cafcc12b-kube-api-access-24ll8\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.293957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-catalog-content\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.294150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-utilities\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.396052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-utilities\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.396144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ll8\" (UniqueName: \"kubernetes.io/projected/b2e75082-e2e5-42c5-adde-61d2cafcc12b-kube-api-access-24ll8\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.396210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-catalog-content\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.396694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-utilities\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.396746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-catalog-content\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.419104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ll8\" (UniqueName: \"kubernetes.io/projected/b2e75082-e2e5-42c5-adde-61d2cafcc12b-kube-api-access-24ll8\") pod \"certified-operators-vmt29\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") " pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.592950 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:00 crc kubenswrapper[4749]: I0219 18:47:00.846165 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmt29"]
Feb 19 18:47:01 crc kubenswrapper[4749]: I0219 18:47:01.572317 4749 generic.go:334] "Generic (PLEG): container finished" podID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerID="b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96" exitCode=0
Feb 19 18:47:01 crc kubenswrapper[4749]: I0219 18:47:01.572377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmt29" event={"ID":"b2e75082-e2e5-42c5-adde-61d2cafcc12b","Type":"ContainerDied","Data":"b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96"}
Feb 19 18:47:01 crc kubenswrapper[4749]: I0219 18:47:01.572642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmt29" event={"ID":"b2e75082-e2e5-42c5-adde-61d2cafcc12b","Type":"ContainerStarted","Data":"71eecc91bd7a8ac72578d3056618f3674f3b22456a6b78dc91e9b7b178319491"}
Feb 19 18:47:03 crc kubenswrapper[4749]: I0219 18:47:03.585862 4749 generic.go:334] "Generic (PLEG): container finished" podID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerID="863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563" exitCode=0
Feb 19 18:47:03 crc kubenswrapper[4749]: I0219 18:47:03.585909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmt29" event={"ID":"b2e75082-e2e5-42c5-adde-61d2cafcc12b","Type":"ContainerDied","Data":"863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563"}
Feb 19 18:47:04 crc kubenswrapper[4749]: I0219 18:47:04.592879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmt29" event={"ID":"b2e75082-e2e5-42c5-adde-61d2cafcc12b","Type":"ContainerStarted","Data":"e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f"}
Feb 19 18:47:04 crc kubenswrapper[4749]: I0219 18:47:04.612387 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vmt29" podStartSLOduration=2.140876702 podStartE2EDuration="4.612371926s" podCreationTimestamp="2026-02-19 18:47:00 +0000 UTC" firstStartedPulling="2026-02-19 18:47:01.573588409 +0000 UTC m=+795.534808363" lastFinishedPulling="2026-02-19 18:47:04.045083613 +0000 UTC m=+798.006303587" observedRunningTime="2026-02-19 18:47:04.608999864 +0000 UTC m=+798.570219818" watchObservedRunningTime="2026-02-19 18:47:04.612371926 +0000 UTC m=+798.573591890"
Feb 19 18:47:05 crc kubenswrapper[4749]: I0219 18:47:05.350896 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66b7b94c9b-n69x7"
Feb 19 18:47:10 crc kubenswrapper[4749]: I0219 18:47:10.594305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:10 crc kubenswrapper[4749]: I0219 18:47:10.594656 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:10 crc kubenswrapper[4749]: I0219 18:47:10.635746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:10 crc kubenswrapper[4749]: I0219 18:47:10.671198 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:12 crc kubenswrapper[4749]: I0219 18:47:12.421415 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmt29"]
Feb 19 18:47:12 crc kubenswrapper[4749]: I0219 18:47:12.633222 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vmt29" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="registry-server" containerID="cri-o://e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f" gracePeriod=2
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.016215 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.147783 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-catalog-content\") pod \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") "
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.147919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-utilities\") pod \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") "
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.147978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ll8\" (UniqueName: \"kubernetes.io/projected/b2e75082-e2e5-42c5-adde-61d2cafcc12b-kube-api-access-24ll8\") pod \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\" (UID: \"b2e75082-e2e5-42c5-adde-61d2cafcc12b\") "
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.148734 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-utilities" (OuterVolumeSpecName: "utilities") pod "b2e75082-e2e5-42c5-adde-61d2cafcc12b" (UID: "b2e75082-e2e5-42c5-adde-61d2cafcc12b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.170588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e75082-e2e5-42c5-adde-61d2cafcc12b-kube-api-access-24ll8" (OuterVolumeSpecName: "kube-api-access-24ll8") pod "b2e75082-e2e5-42c5-adde-61d2cafcc12b" (UID: "b2e75082-e2e5-42c5-adde-61d2cafcc12b"). InnerVolumeSpecName "kube-api-access-24ll8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.211542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2e75082-e2e5-42c5-adde-61d2cafcc12b" (UID: "b2e75082-e2e5-42c5-adde-61d2cafcc12b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.256675 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.257049 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e75082-e2e5-42c5-adde-61d2cafcc12b-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.257059 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ll8\" (UniqueName: \"kubernetes.io/projected/b2e75082-e2e5-42c5-adde-61d2cafcc12b-kube-api-access-24ll8\") on node \"crc\" DevicePath \"\""
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.641611 4749 generic.go:334] "Generic (PLEG): container finished" podID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerID="e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f" exitCode=0
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.641652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmt29" event={"ID":"b2e75082-e2e5-42c5-adde-61d2cafcc12b","Type":"ContainerDied","Data":"e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f"}
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.641676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmt29" event={"ID":"b2e75082-e2e5-42c5-adde-61d2cafcc12b","Type":"ContainerDied","Data":"71eecc91bd7a8ac72578d3056618f3674f3b22456a6b78dc91e9b7b178319491"}
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.641696 4749 scope.go:117] "RemoveContainer" containerID="e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.641713 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmt29"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.668280 4749 scope.go:117] "RemoveContainer" containerID="863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.680934 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmt29"]
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.685567 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vmt29"]
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.701218 4749 scope.go:117] "RemoveContainer" containerID="b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.714459 4749 scope.go:117] "RemoveContainer" containerID="e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f"
Feb 19 18:47:13 crc kubenswrapper[4749]: E0219 18:47:13.714933 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f\": container with ID starting with e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f not found: ID does not exist" containerID="e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.714970 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f"} err="failed to get container status \"e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f\": rpc error: code = NotFound desc = could not find container \"e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f\": container with ID starting with e704521b259361c53f175997f64c144bcf25e36e16603e75aa99fc4ec424c49f not found: ID does not exist"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.714995 4749 scope.go:117] "RemoveContainer" containerID="863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563"
Feb 19 18:47:13 crc kubenswrapper[4749]: E0219 18:47:13.715359 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563\": container with ID starting with 863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563 not found: ID does not exist" containerID="863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.715380 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563"} err="failed to get container status \"863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563\": rpc error: code = NotFound desc = could not find container \"863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563\": container with ID starting with 863589a8222fd429fcf8ebd9c465df97dc24112bae55c352efe2677b68dc9563 not found: ID does not exist"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.715394 4749 scope.go:117] "RemoveContainer" containerID="b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96"
Feb 19 18:47:13 crc kubenswrapper[4749]: E0219 18:47:13.715810 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96\": container with ID starting with b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96 not found: ID does not exist" containerID="b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96"
Feb 19 18:47:13 crc kubenswrapper[4749]: I0219 18:47:13.715833 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96"} err="failed to get container status \"b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96\": rpc error: code = NotFound desc = could not find container \"b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96\": container with ID starting with b1fe7084b7736705ea7e83c05c530a0f1fd3f6d9e53471964369357c61e99f96 not found: ID does not exist"
Feb 19 18:47:14 crc kubenswrapper[4749]: I0219 18:47:14.687912 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" path="/var/lib/kubelet/pods/b2e75082-e2e5-42c5-adde-61d2cafcc12b/volumes"
Feb 19 18:47:24 crc kubenswrapper[4749]: I0219 18:47:24.725889 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 18:47:24 crc kubenswrapper[4749]: I0219 18:47:24.726727 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 18:47:24 crc kubenswrapper[4749]: I0219 18:47:24.726785 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 18:47:24 crc kubenswrapper[4749]: I0219 18:47:24.727473 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d32840142f59a3f51a6617459783496dfcb99167a3d91ff021347454591db672"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 18:47:24 crc kubenswrapper[4749]: I0219 18:47:24.727545 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://d32840142f59a3f51a6617459783496dfcb99167a3d91ff021347454591db672" gracePeriod=600
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.047368 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74989bddb6-dcst5"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.711656 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="d32840142f59a3f51a6617459783496dfcb99167a3d91ff021347454591db672" exitCode=0
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.711685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"d32840142f59a3f51a6617459783496dfcb99167a3d91ff021347454591db672"}
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.712083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"53301e57110cdb8ea70a0d60fa7f17a0a5c063180c8f2db9d35e0ad01b3622e9"}
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.712105 4749 scope.go:117] "RemoveContainer" containerID="834fb3e32a872fa725853bfb5119dcd730968979e3bcf1f345f3d75fe740a490"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.909755 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-qx65k"]
Feb 19 18:47:25 crc kubenswrapper[4749]: E0219 18:47:25.909971 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="extract-content"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.909982 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="extract-content"
Feb 19 18:47:25 crc kubenswrapper[4749]: E0219 18:47:25.909994 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="extract-utilities"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.910001 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="extract-utilities"
Feb 19 18:47:25 crc kubenswrapper[4749]: E0219 18:47:25.910014 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="registry-server"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.910035 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="registry-server"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.910150 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e75082-e2e5-42c5-adde-61d2cafcc12b" containerName="registry-server"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.912227 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.917103 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.922510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4qvvj"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.922511 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.952774 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b"]
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.953492 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.956018 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963044 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b"]
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgnx8\" (UniqueName: \"kubernetes.io/projected/ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5-kube-api-access-jgnx8\") pod \"frr-k8s-webhook-server-78b44bf5bb-66v4b\" (UID: \"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics-certs\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-conf\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-startup\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhkdn\" (UniqueName: \"kubernetes.io/projected/c7166b5c-848f-44ba-a13a-a8c5b6301844-kube-api-access-hhkdn\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.963957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-66v4b\" (UID: \"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.964014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-sockets\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:25 crc kubenswrapper[4749]: I0219 18:47:25.964059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-reloader\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.016513 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6x54g"]
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.017514 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6x54g"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.019202 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.019209 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.019441 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.019552 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gh5l6"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.028054 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-trs7x"]
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.028880 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-trs7x"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.031040 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.052291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-trs7x"]
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.064830 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-reloader\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.064871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d4ac8583-8e6a-4c40-9903-37fe6f82d038-metallb-excludel2\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.064890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-metrics-certs\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.064904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-cert\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.064930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgnx8\" (UniqueName: \"kubernetes.io/projected/ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5-kube-api-access-jgnx8\") pod \"frr-k8s-webhook-server-78b44bf5bb-66v4b\" (UID: \"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b"
Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.064949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics-certs\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k"
Feb 19 18:47:26 crc kubenswrapper[4749]: E0219 18:47:26.065050 4749 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Feb 19 18:47:26 crc kubenswrapper[4749]: E0219 18:47:26.065106 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics-certs podName:c7166b5c-848f-44ba-a13a-a8c5b6301844 nodeName:}" failed. No retries permitted until 2026-02-19 18:47:26.565087328 +0000 UTC m=+820.526307282 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics-certs") pod "frr-k8s-qx65k" (UID: "c7166b5c-848f-44ba-a13a-a8c5b6301844") : secret "frr-k8s-certs-secret" not found Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-metrics-certs\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-conf\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-reloader\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-startup\") pod \"frr-k8s-qx65k\" (UID: 
\"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhkdn\" (UniqueName: \"kubernetes.io/projected/c7166b5c-848f-44ba-a13a-a8c5b6301844-kube-api-access-hhkdn\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-66v4b\" (UID: \"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btk75\" (UniqueName: \"kubernetes.io/projected/d4ac8583-8e6a-4c40-9903-37fe6f82d038-kube-api-access-btk75\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc 
kubenswrapper[4749]: I0219 18:47:26.065575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-conf\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-sockets\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.065752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snkhf\" (UniqueName: \"kubernetes.io/projected/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-kube-api-access-snkhf\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.066002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-sockets\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.066443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c7166b5c-848f-44ba-a13a-a8c5b6301844-frr-startup\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.077436 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-66v4b\" (UID: \"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.079400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgnx8\" (UniqueName: \"kubernetes.io/projected/ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5-kube-api-access-jgnx8\") pod \"frr-k8s-webhook-server-78b44bf5bb-66v4b\" (UID: \"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.084801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhkdn\" (UniqueName: \"kubernetes.io/projected/c7166b5c-848f-44ba-a13a-a8c5b6301844-kube-api-access-hhkdn\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.166395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snkhf\" (UniqueName: \"kubernetes.io/projected/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-kube-api-access-snkhf\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.166457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d4ac8583-8e6a-4c40-9903-37fe6f82d038-metallb-excludel2\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.166482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-metrics-certs\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.166501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-cert\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.167124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d4ac8583-8e6a-4c40-9903-37fe6f82d038-metallb-excludel2\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.167323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-metrics-certs\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.167372 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.167391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btk75\" (UniqueName: \"kubernetes.io/projected/d4ac8583-8e6a-4c40-9903-37fe6f82d038-kube-api-access-btk75\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " 
pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: E0219 18:47:26.167545 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 18:47:26 crc kubenswrapper[4749]: E0219 18:47:26.167592 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist podName:d4ac8583-8e6a-4c40-9903-37fe6f82d038 nodeName:}" failed. No retries permitted until 2026-02-19 18:47:26.667579431 +0000 UTC m=+820.628799385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist") pod "speaker-6x54g" (UID: "d4ac8583-8e6a-4c40-9903-37fe6f82d038") : secret "metallb-memberlist" not found Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.169705 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.170199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-metrics-certs\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.171200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-metrics-certs\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.181900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-cert\") pod \"controller-69bbfbf88f-trs7x\" (UID: 
\"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.186627 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btk75\" (UniqueName: \"kubernetes.io/projected/d4ac8583-8e6a-4c40-9903-37fe6f82d038-kube-api-access-btk75\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.194810 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snkhf\" (UniqueName: \"kubernetes.io/projected/0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc-kube-api-access-snkhf\") pod \"controller-69bbfbf88f-trs7x\" (UID: \"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc\") " pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.267476 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.343384 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.477516 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b"] Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.575121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics-certs\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.581461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7166b5c-848f-44ba-a13a-a8c5b6301844-metrics-certs\") pod \"frr-k8s-qx65k\" (UID: \"c7166b5c-848f-44ba-a13a-a8c5b6301844\") " pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.676532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:26 crc kubenswrapper[4749]: E0219 18:47:26.676686 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 18:47:26 crc kubenswrapper[4749]: E0219 18:47:26.676748 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist podName:d4ac8583-8e6a-4c40-9903-37fe6f82d038 nodeName:}" failed. No retries permitted until 2026-02-19 18:47:27.676731176 +0000 UTC m=+821.637951130 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist") pod "speaker-6x54g" (UID: "d4ac8583-8e6a-4c40-9903-37fe6f82d038") : secret "metallb-memberlist" not found Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.721599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" event={"ID":"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5","Type":"ContainerStarted","Data":"fc386189e8a80be701ccf53292f404f6b286a78f7cb7cd414b196698c52fb93b"} Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.767820 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-trs7x"] Feb 19 18:47:26 crc kubenswrapper[4749]: W0219 18:47:26.777517 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0920f1a5_2f5c_4ec0_b20a_4c1e2a152ccc.slice/crio-be9a203c2b25add32f98c2bc5fc5d77fed602d3314e4948ee0428fc558cc79b1 WatchSource:0}: Error finding container be9a203c2b25add32f98c2bc5fc5d77fed602d3314e4948ee0428fc558cc79b1: Status 404 returned error can't find the container with id be9a203c2b25add32f98c2bc5fc5d77fed602d3314e4948ee0428fc558cc79b1 Feb 19 18:47:26 crc kubenswrapper[4749]: I0219 18:47:26.831506 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.692786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.698540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4ac8583-8e6a-4c40-9903-37fe6f82d038-memberlist\") pod \"speaker-6x54g\" (UID: \"d4ac8583-8e6a-4c40-9903-37fe6f82d038\") " pod="metallb-system/speaker-6x54g" Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.730522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-trs7x" event={"ID":"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc","Type":"ContainerStarted","Data":"37212485922fc1bef5bda5612922333fcb8428e0c2fc0fc915acbf45171286e2"} Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.730561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-trs7x" event={"ID":"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc","Type":"ContainerStarted","Data":"ce655fdbc8c278a2a21ff827b05a2b66cba43fe8fe250501014a61bc6b4f94e6"} Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.730570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-trs7x" event={"ID":"0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc","Type":"ContainerStarted","Data":"be9a203c2b25add32f98c2bc5fc5d77fed602d3314e4948ee0428fc558cc79b1"} Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.730612 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.732666 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerStarted","Data":"3886e57fd258ba04351bc254833fa59fe463a7c6187f30b16aa4cd1827ff07ba"} Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.754821 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-trs7x" podStartSLOduration=1.754792693 podStartE2EDuration="1.754792693s" podCreationTimestamp="2026-02-19 18:47:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:47:27.743617255 +0000 UTC m=+821.704837209" watchObservedRunningTime="2026-02-19 18:47:27.754792693 +0000 UTC m=+821.716012687" Feb 19 18:47:27 crc kubenswrapper[4749]: I0219 18:47:27.830579 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6x54g" Feb 19 18:47:27 crc kubenswrapper[4749]: W0219 18:47:27.856297 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4ac8583_8e6a_4c40_9903_37fe6f82d038.slice/crio-24068966e80a2e97bd325c63357340004e6c8bfa6d288827c0349ed069e7148b WatchSource:0}: Error finding container 24068966e80a2e97bd325c63357340004e6c8bfa6d288827c0349ed069e7148b: Status 404 returned error can't find the container with id 24068966e80a2e97bd325c63357340004e6c8bfa6d288827c0349ed069e7148b Feb 19 18:47:28 crc kubenswrapper[4749]: I0219 18:47:28.744505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6x54g" event={"ID":"d4ac8583-8e6a-4c40-9903-37fe6f82d038","Type":"ContainerStarted","Data":"a60eacd35319088bf65f25dabaf0b60de9ace42ceea5b165ab3ccad51afd55c7"} Feb 19 18:47:28 crc kubenswrapper[4749]: I0219 18:47:28.744904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6x54g" 
event={"ID":"d4ac8583-8e6a-4c40-9903-37fe6f82d038","Type":"ContainerStarted","Data":"0a7aaf91173701459967da191cea3c2d50b3b683b7dd4fdcb171ad8d57ed299b"} Feb 19 18:47:28 crc kubenswrapper[4749]: I0219 18:47:28.744916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6x54g" event={"ID":"d4ac8583-8e6a-4c40-9903-37fe6f82d038","Type":"ContainerStarted","Data":"24068966e80a2e97bd325c63357340004e6c8bfa6d288827c0349ed069e7148b"} Feb 19 18:47:28 crc kubenswrapper[4749]: I0219 18:47:28.745191 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6x54g" Feb 19 18:47:28 crc kubenswrapper[4749]: I0219 18:47:28.763938 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6x54g" podStartSLOduration=2.763920314 podStartE2EDuration="2.763920314s" podCreationTimestamp="2026-02-19 18:47:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:47:28.760465571 +0000 UTC m=+822.721685525" watchObservedRunningTime="2026-02-19 18:47:28.763920314 +0000 UTC m=+822.725140268" Feb 19 18:47:35 crc kubenswrapper[4749]: I0219 18:47:35.807675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" event={"ID":"ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5","Type":"ContainerStarted","Data":"d973e096ef1ea6a24b195abb3c5ec776c9577c7021ded7488fb521a4a359ac6c"} Feb 19 18:47:35 crc kubenswrapper[4749]: I0219 18:47:35.808298 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" Feb 19 18:47:35 crc kubenswrapper[4749]: I0219 18:47:35.812164 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7166b5c-848f-44ba-a13a-a8c5b6301844" containerID="d8c9c2c2ef9838351e4f875b73378cbedc7cf5071613e2d0c79dab56da996b73" exitCode=0 Feb 19 18:47:35 crc 
kubenswrapper[4749]: I0219 18:47:35.812222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerDied","Data":"d8c9c2c2ef9838351e4f875b73378cbedc7cf5071613e2d0c79dab56da996b73"} Feb 19 18:47:35 crc kubenswrapper[4749]: I0219 18:47:35.827978 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" podStartSLOduration=2.606919779 podStartE2EDuration="10.827963591s" podCreationTimestamp="2026-02-19 18:47:25 +0000 UTC" firstStartedPulling="2026-02-19 18:47:26.480731466 +0000 UTC m=+820.441951420" lastFinishedPulling="2026-02-19 18:47:34.701775278 +0000 UTC m=+828.662995232" observedRunningTime="2026-02-19 18:47:35.822135881 +0000 UTC m=+829.783355855" watchObservedRunningTime="2026-02-19 18:47:35.827963591 +0000 UTC m=+829.789183545" Feb 19 18:47:36 crc kubenswrapper[4749]: I0219 18:47:36.349217 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-trs7x" Feb 19 18:47:36 crc kubenswrapper[4749]: I0219 18:47:36.818940 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7166b5c-848f-44ba-a13a-a8c5b6301844" containerID="dce9cd1b2621a6a960e9586210fc884754c839454a5443eea6ca63477c783ba8" exitCode=0 Feb 19 18:47:36 crc kubenswrapper[4749]: I0219 18:47:36.818973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerDied","Data":"dce9cd1b2621a6a960e9586210fc884754c839454a5443eea6ca63477c783ba8"} Feb 19 18:47:37 crc kubenswrapper[4749]: I0219 18:47:37.826820 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7166b5c-848f-44ba-a13a-a8c5b6301844" containerID="984e117afa060056de2c643ac2e12567c2d0d2b20f44831e0872179c4d25cf84" exitCode=0 Feb 19 18:47:37 crc kubenswrapper[4749]: I0219 18:47:37.826855 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerDied","Data":"984e117afa060056de2c643ac2e12567c2d0d2b20f44831e0872179c4d25cf84"} Feb 19 18:47:38 crc kubenswrapper[4749]: I0219 18:47:38.850531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerStarted","Data":"27032ee2d48e6a30266b56014092750cf5c272b4102368cbdfb49d830da1b588"} Feb 19 18:47:38 crc kubenswrapper[4749]: I0219 18:47:38.851107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerStarted","Data":"e53474dc414e24b2f1f401b6cbdbd7d63690d99cf9ca32996f95c2f4120817da"} Feb 19 18:47:38 crc kubenswrapper[4749]: I0219 18:47:38.851123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerStarted","Data":"b2bd4c50fa04c329dfb1162517b696946400013b6fbc294d7198c0227b8233b3"} Feb 19 18:47:38 crc kubenswrapper[4749]: I0219 18:47:38.851135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerStarted","Data":"afd40cf3767f8e4759f363708703cf98acf7bd7f5b94fe90a09ce3bf9da5f361"} Feb 19 18:47:38 crc kubenswrapper[4749]: I0219 18:47:38.851146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerStarted","Data":"ba7a5324579830edd651ec0396d7996f727bd091f912e18f5deed7311da2b401"} Feb 19 18:47:39 crc kubenswrapper[4749]: I0219 18:47:39.859715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qx65k" 
event={"ID":"c7166b5c-848f-44ba-a13a-a8c5b6301844","Type":"ContainerStarted","Data":"6f9da71f27547a18caf19652f6590ab82b4e4ae9fad97e48b5dee974ddace37c"} Feb 19 18:47:39 crc kubenswrapper[4749]: I0219 18:47:39.860534 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:39 crc kubenswrapper[4749]: I0219 18:47:39.879722 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qx65k" podStartSLOduration=7.147983948 podStartE2EDuration="14.87970295s" podCreationTimestamp="2026-02-19 18:47:25 +0000 UTC" firstStartedPulling="2026-02-19 18:47:26.944217324 +0000 UTC m=+820.905437278" lastFinishedPulling="2026-02-19 18:47:34.675936316 +0000 UTC m=+828.637156280" observedRunningTime="2026-02-19 18:47:39.878670395 +0000 UTC m=+833.839890369" watchObservedRunningTime="2026-02-19 18:47:39.87970295 +0000 UTC m=+833.840922914" Feb 19 18:47:41 crc kubenswrapper[4749]: I0219 18:47:41.832509 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:41 crc kubenswrapper[4749]: I0219 18:47:41.868156 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:46 crc kubenswrapper[4749]: I0219 18:47:46.275522 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-66v4b" Feb 19 18:47:47 crc kubenswrapper[4749]: I0219 18:47:47.834480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6x54g" Feb 19 18:47:50 crc kubenswrapper[4749]: I0219 18:47:50.922732 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hzl6s"] Feb 19 18:47:50 crc kubenswrapper[4749]: I0219 18:47:50.923523 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hzl6s" Feb 19 18:47:50 crc kubenswrapper[4749]: I0219 18:47:50.925081 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 18:47:50 crc kubenswrapper[4749]: I0219 18:47:50.926364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nw6vx" Feb 19 18:47:50 crc kubenswrapper[4749]: I0219 18:47:50.926558 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 18:47:50 crc kubenswrapper[4749]: I0219 18:47:50.944011 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hzl6s"] Feb 19 18:47:51 crc kubenswrapper[4749]: I0219 18:47:51.015871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5smm\" (UniqueName: \"kubernetes.io/projected/e873ac05-1fcd-4895-a084-ea3f3b87d779-kube-api-access-g5smm\") pod \"openstack-operator-index-hzl6s\" (UID: \"e873ac05-1fcd-4895-a084-ea3f3b87d779\") " pod="openstack-operators/openstack-operator-index-hzl6s" Feb 19 18:47:51 crc kubenswrapper[4749]: I0219 18:47:51.117731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5smm\" (UniqueName: \"kubernetes.io/projected/e873ac05-1fcd-4895-a084-ea3f3b87d779-kube-api-access-g5smm\") pod \"openstack-operator-index-hzl6s\" (UID: \"e873ac05-1fcd-4895-a084-ea3f3b87d779\") " pod="openstack-operators/openstack-operator-index-hzl6s" Feb 19 18:47:51 crc kubenswrapper[4749]: I0219 18:47:51.133745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5smm\" (UniqueName: \"kubernetes.io/projected/e873ac05-1fcd-4895-a084-ea3f3b87d779-kube-api-access-g5smm\") pod \"openstack-operator-index-hzl6s\" (UID: 
\"e873ac05-1fcd-4895-a084-ea3f3b87d779\") " pod="openstack-operators/openstack-operator-index-hzl6s" Feb 19 18:47:51 crc kubenswrapper[4749]: I0219 18:47:51.240184 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hzl6s" Feb 19 18:47:51 crc kubenswrapper[4749]: I0219 18:47:51.634807 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hzl6s"] Feb 19 18:47:51 crc kubenswrapper[4749]: I0219 18:47:51.933959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzl6s" event={"ID":"e873ac05-1fcd-4895-a084-ea3f3b87d779","Type":"ContainerStarted","Data":"02e4fabffc954251ac41b9d996f1e86ac9cb134b328946725103c985068d064b"} Feb 19 18:47:54 crc kubenswrapper[4749]: I0219 18:47:54.495523 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hzl6s"] Feb 19 18:47:54 crc kubenswrapper[4749]: I0219 18:47:54.957678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzl6s" event={"ID":"e873ac05-1fcd-4895-a084-ea3f3b87d779","Type":"ContainerStarted","Data":"ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1"} Feb 19 18:47:54 crc kubenswrapper[4749]: I0219 18:47:54.957824 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hzl6s" podUID="e873ac05-1fcd-4895-a084-ea3f3b87d779" containerName="registry-server" containerID="cri-o://ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1" gracePeriod=2 Feb 19 18:47:54 crc kubenswrapper[4749]: I0219 18:47:54.970867 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hzl6s" podStartSLOduration=1.830966059 podStartE2EDuration="4.970848325s" podCreationTimestamp="2026-02-19 18:47:50 +0000 UTC" 
firstStartedPulling="2026-02-19 18:47:51.638548815 +0000 UTC m=+845.599768769" lastFinishedPulling="2026-02-19 18:47:54.778431081 +0000 UTC m=+848.739651035" observedRunningTime="2026-02-19 18:47:54.968813646 +0000 UTC m=+848.930033620" watchObservedRunningTime="2026-02-19 18:47:54.970848325 +0000 UTC m=+848.932068299" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.100193 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-n89m9"] Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.101345 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.110865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-n89m9"] Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.169361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95blx\" (UniqueName: \"kubernetes.io/projected/c177988f-7956-4b60-aaf0-0ece549e28cb-kube-api-access-95blx\") pod \"openstack-operator-index-n89m9\" (UID: \"c177988f-7956-4b60-aaf0-0ece549e28cb\") " pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.270956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95blx\" (UniqueName: \"kubernetes.io/projected/c177988f-7956-4b60-aaf0-0ece549e28cb-kube-api-access-95blx\") pod \"openstack-operator-index-n89m9\" (UID: \"c177988f-7956-4b60-aaf0-0ece549e28cb\") " pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.292383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95blx\" (UniqueName: \"kubernetes.io/projected/c177988f-7956-4b60-aaf0-0ece549e28cb-kube-api-access-95blx\") pod 
\"openstack-operator-index-n89m9\" (UID: \"c177988f-7956-4b60-aaf0-0ece549e28cb\") " pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.319967 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hzl6s" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.371905 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5smm\" (UniqueName: \"kubernetes.io/projected/e873ac05-1fcd-4895-a084-ea3f3b87d779-kube-api-access-g5smm\") pod \"e873ac05-1fcd-4895-a084-ea3f3b87d779\" (UID: \"e873ac05-1fcd-4895-a084-ea3f3b87d779\") " Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.375281 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e873ac05-1fcd-4895-a084-ea3f3b87d779-kube-api-access-g5smm" (OuterVolumeSpecName: "kube-api-access-g5smm") pod "e873ac05-1fcd-4895-a084-ea3f3b87d779" (UID: "e873ac05-1fcd-4895-a084-ea3f3b87d779"). InnerVolumeSpecName "kube-api-access-g5smm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.428859 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.474052 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5smm\" (UniqueName: \"kubernetes.io/projected/e873ac05-1fcd-4895-a084-ea3f3b87d779-kube-api-access-g5smm\") on node \"crc\" DevicePath \"\"" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.812492 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-n89m9"] Feb 19 18:47:55 crc kubenswrapper[4749]: W0219 18:47:55.813855 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc177988f_7956_4b60_aaf0_0ece549e28cb.slice/crio-210c2e5473bd9497999eb542f5344f00c47f38a72d120f015fd97541b9a616d5 WatchSource:0}: Error finding container 210c2e5473bd9497999eb542f5344f00c47f38a72d120f015fd97541b9a616d5: Status 404 returned error can't find the container with id 210c2e5473bd9497999eb542f5344f00c47f38a72d120f015fd97541b9a616d5 Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.966971 4749 generic.go:334] "Generic (PLEG): container finished" podID="e873ac05-1fcd-4895-a084-ea3f3b87d779" containerID="ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1" exitCode=0 Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.967221 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hzl6s" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.967767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzl6s" event={"ID":"e873ac05-1fcd-4895-a084-ea3f3b87d779","Type":"ContainerDied","Data":"ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1"} Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.967806 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzl6s" event={"ID":"e873ac05-1fcd-4895-a084-ea3f3b87d779","Type":"ContainerDied","Data":"02e4fabffc954251ac41b9d996f1e86ac9cb134b328946725103c985068d064b"} Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.967825 4749 scope.go:117] "RemoveContainer" containerID="ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1" Feb 19 18:47:55 crc kubenswrapper[4749]: I0219 18:47:55.971397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n89m9" event={"ID":"c177988f-7956-4b60-aaf0-0ece549e28cb","Type":"ContainerStarted","Data":"210c2e5473bd9497999eb542f5344f00c47f38a72d120f015fd97541b9a616d5"} Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.000797 4749 scope.go:117] "RemoveContainer" containerID="ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1" Feb 19 18:47:56 crc kubenswrapper[4749]: E0219 18:47:56.001419 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1\": container with ID starting with ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1 not found: ID does not exist" containerID="ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1" Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.001453 4749 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1"} err="failed to get container status \"ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1\": rpc error: code = NotFound desc = could not find container \"ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1\": container with ID starting with ee421365f03023ba262870377724d698a3f8ae5ca3099a7c2987ba2b402e9aa1 not found: ID does not exist" Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.023813 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hzl6s"] Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.031372 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hzl6s"] Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.690272 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e873ac05-1fcd-4895-a084-ea3f3b87d779" path="/var/lib/kubelet/pods/e873ac05-1fcd-4895-a084-ea3f3b87d779/volumes" Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.836747 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qx65k" Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.980626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n89m9" event={"ID":"c177988f-7956-4b60-aaf0-0ece549e28cb","Type":"ContainerStarted","Data":"3973412bfcb5cebaff5f31b9d6bcaad51878baa28cbb77403671cd39d60e4064"} Feb 19 18:47:56 crc kubenswrapper[4749]: I0219 18:47:56.999833 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-n89m9" podStartSLOduration=1.947905899 podStartE2EDuration="1.999796714s" podCreationTimestamp="2026-02-19 18:47:55 +0000 UTC" firstStartedPulling="2026-02-19 18:47:55.817758391 +0000 UTC m=+849.778978345" 
lastFinishedPulling="2026-02-19 18:47:55.869649206 +0000 UTC m=+849.830869160" observedRunningTime="2026-02-19 18:47:56.996528375 +0000 UTC m=+850.957748349" watchObservedRunningTime="2026-02-19 18:47:56.999796714 +0000 UTC m=+850.961016678" Feb 19 18:48:05 crc kubenswrapper[4749]: I0219 18:48:05.433962 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:48:05 crc kubenswrapper[4749]: I0219 18:48:05.434564 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:48:05 crc kubenswrapper[4749]: I0219 18:48:05.459116 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:48:06 crc kubenswrapper[4749]: I0219 18:48:06.063199 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-n89m9" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.329022 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548"] Feb 19 18:48:07 crc kubenswrapper[4749]: E0219 18:48:07.329618 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e873ac05-1fcd-4895-a084-ea3f3b87d779" containerName="registry-server" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.329630 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e873ac05-1fcd-4895-a084-ea3f3b87d779" containerName="registry-server" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.329740 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e873ac05-1fcd-4895-a084-ea3f3b87d779" containerName="registry-server" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.330654 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.333425 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wrw6t" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.348740 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548"] Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.435182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-util\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.435413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-bundle\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.435566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r27k\" (UniqueName: \"kubernetes.io/projected/a22a9813-be4b-4990-b503-3a1c2c30ea1d-kube-api-access-8r27k\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 
18:48:07.536411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r27k\" (UniqueName: \"kubernetes.io/projected/a22a9813-be4b-4990-b503-3a1c2c30ea1d-kube-api-access-8r27k\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.536457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-util\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.536547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-bundle\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.537112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-bundle\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.537148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-util\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.556640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r27k\" (UniqueName: \"kubernetes.io/projected/a22a9813-be4b-4990-b503-3a1c2c30ea1d-kube-api-access-8r27k\") pod \"7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:07 crc kubenswrapper[4749]: I0219 18:48:07.646873 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:08 crc kubenswrapper[4749]: I0219 18:48:08.065012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548"] Feb 19 18:48:08 crc kubenswrapper[4749]: W0219 18:48:08.070976 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22a9813_be4b_4990_b503_3a1c2c30ea1d.slice/crio-6bb4f3ddccb25042ff3a66457a84551036add063fa9a3c6cc4d0d93cda86f9bf WatchSource:0}: Error finding container 6bb4f3ddccb25042ff3a66457a84551036add063fa9a3c6cc4d0d93cda86f9bf: Status 404 returned error can't find the container with id 6bb4f3ddccb25042ff3a66457a84551036add063fa9a3c6cc4d0d93cda86f9bf Feb 19 18:48:09 crc kubenswrapper[4749]: I0219 18:48:09.067080 4749 generic.go:334] "Generic (PLEG): container finished" podID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerID="4a2c130143cf8f27518982d51dfbe3c708e5c6b7de80b0c7465c9c1348cc9a66" exitCode=0 Feb 19 
18:48:09 crc kubenswrapper[4749]: I0219 18:48:09.067188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" event={"ID":"a22a9813-be4b-4990-b503-3a1c2c30ea1d","Type":"ContainerDied","Data":"4a2c130143cf8f27518982d51dfbe3c708e5c6b7de80b0c7465c9c1348cc9a66"} Feb 19 18:48:09 crc kubenswrapper[4749]: I0219 18:48:09.067565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" event={"ID":"a22a9813-be4b-4990-b503-3a1c2c30ea1d","Type":"ContainerStarted","Data":"6bb4f3ddccb25042ff3a66457a84551036add063fa9a3c6cc4d0d93cda86f9bf"} Feb 19 18:48:10 crc kubenswrapper[4749]: I0219 18:48:10.074690 4749 generic.go:334] "Generic (PLEG): container finished" podID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerID="6d4a3e8ef0c21738d7f2dafd0285abd99cd1a3a7ac6f83ec96720fd21f0503f7" exitCode=0 Feb 19 18:48:10 crc kubenswrapper[4749]: I0219 18:48:10.074775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" event={"ID":"a22a9813-be4b-4990-b503-3a1c2c30ea1d","Type":"ContainerDied","Data":"6d4a3e8ef0c21738d7f2dafd0285abd99cd1a3a7ac6f83ec96720fd21f0503f7"} Feb 19 18:48:11 crc kubenswrapper[4749]: I0219 18:48:11.085837 4749 generic.go:334] "Generic (PLEG): container finished" podID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerID="df2e41edee0e1047594a54391f49688704bdbd73e89a8517b20acbba882e523c" exitCode=0 Feb 19 18:48:11 crc kubenswrapper[4749]: I0219 18:48:11.085876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" event={"ID":"a22a9813-be4b-4990-b503-3a1c2c30ea1d","Type":"ContainerDied","Data":"df2e41edee0e1047594a54391f49688704bdbd73e89a8517b20acbba882e523c"} Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.332444 
4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.397006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-bundle\") pod \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.397220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r27k\" (UniqueName: \"kubernetes.io/projected/a22a9813-be4b-4990-b503-3a1c2c30ea1d-kube-api-access-8r27k\") pod \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.397257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-util\") pod \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\" (UID: \"a22a9813-be4b-4990-b503-3a1c2c30ea1d\") " Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.398902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-bundle" (OuterVolumeSpecName: "bundle") pod "a22a9813-be4b-4990-b503-3a1c2c30ea1d" (UID: "a22a9813-be4b-4990-b503-3a1c2c30ea1d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.403735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22a9813-be4b-4990-b503-3a1c2c30ea1d-kube-api-access-8r27k" (OuterVolumeSpecName: "kube-api-access-8r27k") pod "a22a9813-be4b-4990-b503-3a1c2c30ea1d" (UID: "a22a9813-be4b-4990-b503-3a1c2c30ea1d"). 
InnerVolumeSpecName "kube-api-access-8r27k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.416786 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-util" (OuterVolumeSpecName: "util") pod "a22a9813-be4b-4990-b503-3a1c2c30ea1d" (UID: "a22a9813-be4b-4990-b503-3a1c2c30ea1d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.499484 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r27k\" (UniqueName: \"kubernetes.io/projected/a22a9813-be4b-4990-b503-3a1c2c30ea1d-kube-api-access-8r27k\") on node \"crc\" DevicePath \"\"" Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.499527 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-util\") on node \"crc\" DevicePath \"\"" Feb 19 18:48:12 crc kubenswrapper[4749]: I0219 18:48:12.499545 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22a9813-be4b-4990-b503-3a1c2c30ea1d-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:48:13 crc kubenswrapper[4749]: I0219 18:48:13.101974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" event={"ID":"a22a9813-be4b-4990-b503-3a1c2c30ea1d","Type":"ContainerDied","Data":"6bb4f3ddccb25042ff3a66457a84551036add063fa9a3c6cc4d0d93cda86f9bf"} Feb 19 18:48:13 crc kubenswrapper[4749]: I0219 18:48:13.102430 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb4f3ddccb25042ff3a66457a84551036add063fa9a3c6cc4d0d93cda86f9bf" Feb 19 18:48:13 crc kubenswrapper[4749]: I0219 18:48:13.102081 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.251318 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9"] Feb 19 18:48:19 crc kubenswrapper[4749]: E0219 18:48:19.251912 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerName="pull" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.251924 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerName="pull" Feb 19 18:48:19 crc kubenswrapper[4749]: E0219 18:48:19.251936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerName="util" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.251942 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerName="util" Feb 19 18:48:19 crc kubenswrapper[4749]: E0219 18:48:19.251954 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerName="extract" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.251960 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerName="extract" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.252090 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22a9813-be4b-4990-b503-3a1c2c30ea1d" containerName="extract" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.252467 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.254424 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-b7j7p" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.268436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9"] Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.310180 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4pb9\" (UniqueName: \"kubernetes.io/projected/83f4c4a7-f126-44f4-9780-82e159ec9ec7-kube-api-access-b4pb9\") pod \"openstack-operator-controller-init-857dd64d7c-c9mq9\" (UID: \"83f4c4a7-f126-44f4-9780-82e159ec9ec7\") " pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.412084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4pb9\" (UniqueName: \"kubernetes.io/projected/83f4c4a7-f126-44f4-9780-82e159ec9ec7-kube-api-access-b4pb9\") pod \"openstack-operator-controller-init-857dd64d7c-c9mq9\" (UID: \"83f4c4a7-f126-44f4-9780-82e159ec9ec7\") " pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.432165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4pb9\" (UniqueName: \"kubernetes.io/projected/83f4c4a7-f126-44f4-9780-82e159ec9ec7-kube-api-access-b4pb9\") pod \"openstack-operator-controller-init-857dd64d7c-c9mq9\" (UID: \"83f4c4a7-f126-44f4-9780-82e159ec9ec7\") " pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9" Feb 19 18:48:19 crc kubenswrapper[4749]: I0219 18:48:19.574424 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9"
Feb 19 18:48:20 crc kubenswrapper[4749]: I0219 18:48:20.018466 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9"]
Feb 19 18:48:20 crc kubenswrapper[4749]: I0219 18:48:20.145328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9" event={"ID":"83f4c4a7-f126-44f4-9780-82e159ec9ec7","Type":"ContainerStarted","Data":"5199370a5645d428eb462494608622c160ad4be31705cc8736b4cc95d81b5fab"}
Feb 19 18:48:25 crc kubenswrapper[4749]: I0219 18:48:25.184760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9" event={"ID":"83f4c4a7-f126-44f4-9780-82e159ec9ec7","Type":"ContainerStarted","Data":"5af33b98d7dbdb3ec04759164638d8eb2b47e5bdb6c646b478cce132fbb1a182"}
Feb 19 18:48:25 crc kubenswrapper[4749]: I0219 18:48:25.185489 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9"
Feb 19 18:48:25 crc kubenswrapper[4749]: I0219 18:48:25.213189 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9" podStartSLOduration=1.454487427 podStartE2EDuration="6.213174008s" podCreationTimestamp="2026-02-19 18:48:19 +0000 UTC" firstStartedPulling="2026-02-19 18:48:20.02836829 +0000 UTC m=+873.989588244" lastFinishedPulling="2026-02-19 18:48:24.787054871 +0000 UTC m=+878.748274825" observedRunningTime="2026-02-19 18:48:25.208891005 +0000 UTC m=+879.170110959" watchObservedRunningTime="2026-02-19 18:48:25.213174008 +0000 UTC m=+879.174393962"
Feb 19 18:48:29 crc kubenswrapper[4749]: I0219 18:48:29.578059 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-857dd64d7c-c9mq9"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.103933 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.105567 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.108132 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9gxx9"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.137608 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.138502 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.153272 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.154980 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-lk5hf"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.168499 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.169236 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.171499 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zwdzf"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.175293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.217739 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.227701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgrl7\" (UniqueName: \"kubernetes.io/projected/ed276a06-3dcf-475c-8d9c-1ee1c364f783-kube-api-access-fgrl7\") pod \"barbican-operator-controller-manager-868647ff47-54ths\" (UID: \"ed276a06-3dcf-475c-8d9c-1ee1c364f783\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.227847 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.228556 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.232122 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6tkxw"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.251762 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.268502 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.269460 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.279959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ms4gs"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.293357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.316088 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.316880 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.328972 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4gbnl"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.329138 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.332947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lxlr\" (UniqueName: \"kubernetes.io/projected/872c81d0-4024-4678-a081-6698ee2fe586-kube-api-access-7lxlr\") pod \"glance-operator-controller-manager-77987464f4-kqx52\" (UID: \"872c81d0-4024-4678-a081-6698ee2fe586\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.332994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mt4f\" (UniqueName: \"kubernetes.io/projected/0c080714-223f-4954-81ad-0fbf2d7ceff1-kube-api-access-2mt4f\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.333042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.333114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bzc\" (UniqueName: \"kubernetes.io/projected/7938ade1-7dc4-4927-b620-4cdcb7125a94-kube-api-access-65bzc\") pod \"cinder-operator-controller-manager-5d946d989d-nlt7x\" (UID: \"7938ade1-7dc4-4927-b620-4cdcb7125a94\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.333149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24gf\" (UniqueName: \"kubernetes.io/projected/b5e9baf7-cc67-4cde-ba8f-256eb3c5601f-kube-api-access-g24gf\") pod \"heat-operator-controller-manager-69f49c598c-xmlcc\" (UID: \"b5e9baf7-cc67-4cde-ba8f-256eb3c5601f\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.333216 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgrl7\" (UniqueName: \"kubernetes.io/projected/ed276a06-3dcf-475c-8d9c-1ee1c364f783-kube-api-access-fgrl7\") pod \"barbican-operator-controller-manager-868647ff47-54ths\" (UID: \"ed276a06-3dcf-475c-8d9c-1ee1c364f783\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.333238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpw6v\" (UniqueName: \"kubernetes.io/projected/82c521c0-6968-4298-afc8-e4aac617b61d-kube-api-access-vpw6v\") pod \"designate-operator-controller-manager-6d8bf5c495-5k6kx\" (UID: \"82c521c0-6968-4298-afc8-e4aac617b61d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.339014 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.340223 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.347903 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nlqk2"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.361199 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.388799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgrl7\" (UniqueName: \"kubernetes.io/projected/ed276a06-3dcf-475c-8d9c-1ee1c364f783-kube-api-access-fgrl7\") pod \"barbican-operator-controller-manager-868647ff47-54ths\" (UID: \"ed276a06-3dcf-475c-8d9c-1ee1c364f783\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.400523 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.411540 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.412523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.417267 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-wljfj"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.440754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65bzc\" (UniqueName: \"kubernetes.io/projected/7938ade1-7dc4-4927-b620-4cdcb7125a94-kube-api-access-65bzc\") pod \"cinder-operator-controller-manager-5d946d989d-nlt7x\" (UID: \"7938ade1-7dc4-4927-b620-4cdcb7125a94\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.440827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g24gf\" (UniqueName: \"kubernetes.io/projected/b5e9baf7-cc67-4cde-ba8f-256eb3c5601f-kube-api-access-g24gf\") pod \"heat-operator-controller-manager-69f49c598c-xmlcc\" (UID: \"b5e9baf7-cc67-4cde-ba8f-256eb3c5601f\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.440884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpw6v\" (UniqueName: \"kubernetes.io/projected/82c521c0-6968-4298-afc8-e4aac617b61d-kube-api-access-vpw6v\") pod \"designate-operator-controller-manager-6d8bf5c495-5k6kx\" (UID: \"82c521c0-6968-4298-afc8-e4aac617b61d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.440931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lxlr\" (UniqueName: \"kubernetes.io/projected/872c81d0-4024-4678-a081-6698ee2fe586-kube-api-access-7lxlr\") pod \"glance-operator-controller-manager-77987464f4-kqx52\" (UID: \"872c81d0-4024-4678-a081-6698ee2fe586\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.440971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mt4f\" (UniqueName: \"kubernetes.io/projected/0c080714-223f-4954-81ad-0fbf2d7ceff1-kube-api-access-2mt4f\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.441005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"
Feb 19 18:48:50 crc kubenswrapper[4749]: E0219 18:48:50.441199 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 19 18:48:50 crc kubenswrapper[4749]: E0219 18:48:50.441263 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert podName:0c080714-223f-4954-81ad-0fbf2d7ceff1 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:50.9412388 +0000 UTC m=+904.902458754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert") pod "infra-operator-controller-manager-79d975b745-c5zvr" (UID: "0c080714-223f-4954-81ad-0fbf2d7ceff1") : secret "infra-operator-webhook-server-cert" not found
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.442340 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.446930 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.451660 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.456184 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6rblj"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.493271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bzc\" (UniqueName: \"kubernetes.io/projected/7938ade1-7dc4-4927-b620-4cdcb7125a94-kube-api-access-65bzc\") pod \"cinder-operator-controller-manager-5d946d989d-nlt7x\" (UID: \"7938ade1-7dc4-4927-b620-4cdcb7125a94\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.493271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mt4f\" (UniqueName: \"kubernetes.io/projected/0c080714-223f-4954-81ad-0fbf2d7ceff1-kube-api-access-2mt4f\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.493765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpw6v\" (UniqueName: \"kubernetes.io/projected/82c521c0-6968-4298-afc8-e4aac617b61d-kube-api-access-vpw6v\") pod \"designate-operator-controller-manager-6d8bf5c495-5k6kx\" (UID: \"82c521c0-6968-4298-afc8-e4aac617b61d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.493786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lxlr\" (UniqueName: \"kubernetes.io/projected/872c81d0-4024-4678-a081-6698ee2fe586-kube-api-access-7lxlr\") pod \"glance-operator-controller-manager-77987464f4-kqx52\" (UID: \"872c81d0-4024-4678-a081-6698ee2fe586\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.494873 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.510641 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.536925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24gf\" (UniqueName: \"kubernetes.io/projected/b5e9baf7-cc67-4cde-ba8f-256eb3c5601f-kube-api-access-g24gf\") pod \"heat-operator-controller-manager-69f49c598c-xmlcc\" (UID: \"b5e9baf7-cc67-4cde-ba8f-256eb3c5601f\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.549461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7q2n\" (UniqueName: \"kubernetes.io/projected/184b233b-5456-42e8-a09b-61221754095e-kube-api-access-r7q2n\") pod \"ironic-operator-controller-manager-554564d7fc-66xhq\" (UID: \"184b233b-5456-42e8-a09b-61221754095e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.549509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk87q\" (UniqueName: \"kubernetes.io/projected/95419345-6f7d-4cb6-b0c6-75bdebf35ade-kube-api-access-lk87q\") pod \"horizon-operator-controller-manager-5b9b8895d5-g44zq\" (UID: \"95419345-6f7d-4cb6-b0c6-75bdebf35ade\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.549555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2psp\" (UniqueName: \"kubernetes.io/projected/bda10183-e834-4f98-a0cf-47ce14f1d333-kube-api-access-h2psp\") pod \"keystone-operator-controller-manager-b4d948c87-7c76f\" (UID: \"bda10183-e834-4f98-a0cf-47ce14f1d333\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.551988 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.572715 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.590440 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.595801 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.596316 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.597557 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.601511 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-xt9lz"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.601740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zx2b6"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.603668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.625630 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.630464 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.638626 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.639872 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.641554 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-ls9s6"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.650983 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.655234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2psp\" (UniqueName: \"kubernetes.io/projected/bda10183-e834-4f98-a0cf-47ce14f1d333-kube-api-access-h2psp\") pod \"keystone-operator-controller-manager-b4d948c87-7c76f\" (UID: \"bda10183-e834-4f98-a0cf-47ce14f1d333\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.660483 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.685216 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.688410 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lmspj"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.693588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7q2n\" (UniqueName: \"kubernetes.io/projected/184b233b-5456-42e8-a09b-61221754095e-kube-api-access-r7q2n\") pod \"ironic-operator-controller-manager-554564d7fc-66xhq\" (UID: \"184b233b-5456-42e8-a09b-61221754095e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.698113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk87q\" (UniqueName: \"kubernetes.io/projected/95419345-6f7d-4cb6-b0c6-75bdebf35ade-kube-api-access-lk87q\") pod \"horizon-operator-controller-manager-5b9b8895d5-g44zq\" (UID: \"95419345-6f7d-4cb6-b0c6-75bdebf35ade\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.702945 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2psp\" (UniqueName: \"kubernetes.io/projected/bda10183-e834-4f98-a0cf-47ce14f1d333-kube-api-access-h2psp\") pod \"keystone-operator-controller-manager-b4d948c87-7c76f\" (UID: \"bda10183-e834-4f98-a0cf-47ce14f1d333\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.724390 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7q2n\" (UniqueName: \"kubernetes.io/projected/184b233b-5456-42e8-a09b-61221754095e-kube-api-access-r7q2n\") pod \"ironic-operator-controller-manager-554564d7fc-66xhq\" (UID: \"184b233b-5456-42e8-a09b-61221754095e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.724699 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-875j6"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.729204 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.729234 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-875j6"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.729309 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.731246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk87q\" (UniqueName: \"kubernetes.io/projected/95419345-6f7d-4cb6-b0c6-75bdebf35ade-kube-api-access-lk87q\") pod \"horizon-operator-controller-manager-5b9b8895d5-g44zq\" (UID: \"95419345-6f7d-4cb6-b0c6-75bdebf35ade\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.733165 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mndcb"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.752835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.753188 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.755170 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.757804 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zpl9r"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.773174 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.774240 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.777106 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qw2xf"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.804424 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622vr\" (UniqueName: \"kubernetes.io/projected/655da61c-572a-43c2-8d53-c3a3e0f95d43-kube-api-access-622vr\") pod \"nova-operator-controller-manager-567668f5cf-x87ll\" (UID: \"655da61c-572a-43c2-8d53-c3a3e0f95d43\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.804487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hl2\" (UniqueName: \"kubernetes.io/projected/eaf1fbac-75fa-4442-811d-8f51e3a1e66b-kube-api-access-j2hl2\") pod \"manila-operator-controller-manager-54f6768c69-g4tkq\" (UID: \"eaf1fbac-75fa-4442-811d-8f51e3a1e66b\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.804536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4hz5\" (UniqueName: \"kubernetes.io/projected/61c9ed9f-0dff-4560-a6d3-a621e1a6ff09-kube-api-access-r4hz5\") pod \"ovn-operator-controller-manager-d44cf6b75-972mw\" (UID: \"61c9ed9f-0dff-4560-a6d3-a621e1a6ff09\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.804577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfdpn\" (UniqueName: \"kubernetes.io/projected/b31580cf-6da4-442c-aa01-bed52414bf52-kube-api-access-tfdpn\") pod \"octavia-operator-controller-manager-69f8888797-875j6\" (UID: \"b31580cf-6da4-442c-aa01-bed52414bf52\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.804612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq29h\" (UniqueName: \"kubernetes.io/projected/aa7dbc9c-deb4-49ae-a0c7-2130343cae10-kube-api-access-jq29h\") pod \"neutron-operator-controller-manager-64ddbf8bb-k6k2h\" (UID: \"aa7dbc9c-deb4-49ae-a0c7-2130343cae10\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.804633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7mtd\" (UniqueName: \"kubernetes.io/projected/0e32dc41-84cc-42d1-bbf8-be4aa4d4b010-kube-api-access-n7mtd\") pod \"mariadb-operator-controller-manager-6994f66f48-5rm6j\" (UID: \"0e32dc41-84cc-42d1-bbf8-be4aa4d4b010\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.812084 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.812968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.814567 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4nzn7"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.835368 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.847276 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.851355 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.855243 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.855390 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.855961 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jczh4"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.874590 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.881146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.885756 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.886612 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.887270 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.889609 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dtf2m"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.903447 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724"]
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hl2\" (UniqueName: \"kubernetes.io/projected/eaf1fbac-75fa-4442-811d-8f51e3a1e66b-kube-api-access-j2hl2\") pod \"manila-operator-controller-manager-54f6768c69-g4tkq\" (UID: \"eaf1fbac-75fa-4442-811d-8f51e3a1e66b\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4hz5\" (UniqueName: \"kubernetes.io/projected/61c9ed9f-0dff-4560-a6d3-a621e1a6ff09-kube-api-access-r4hz5\") pod \"ovn-operator-controller-manager-d44cf6b75-972mw\" (UID: \"61c9ed9f-0dff-4560-a6d3-a621e1a6ff09\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906642 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgwk\" (UniqueName: \"kubernetes.io/projected/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-kube-api-access-jvgwk\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfdpn\" (UniqueName: \"kubernetes.io/projected/b31580cf-6da4-442c-aa01-bed52414bf52-kube-api-access-tfdpn\") pod \"octavia-operator-controller-manager-69f8888797-875j6\" (UID: \"b31580cf-6da4-442c-aa01-bed52414bf52\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpw4c\" (UniqueName: \"kubernetes.io/projected/c8870691-c7b5-4715-8db1-2ac0f6c56ad9-kube-api-access-tpw4c\") pod \"swift-operator-controller-manager-68f46476f-mcgwg\" (UID: \"c8870691-c7b5-4715-8db1-2ac0f6c56ad9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq29h\" (UniqueName: \"kubernetes.io/projected/aa7dbc9c-deb4-49ae-a0c7-2130343cae10-kube-api-access-jq29h\") pod \"neutron-operator-controller-manager-64ddbf8bb-k6k2h\" (UID: \"aa7dbc9c-deb4-49ae-a0c7-2130343cae10\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h"
Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7mtd\" (UniqueName:
\"kubernetes.io/projected/0e32dc41-84cc-42d1-bbf8-be4aa4d4b010-kube-api-access-n7mtd\") pod \"mariadb-operator-controller-manager-6994f66f48-5rm6j\" (UID: \"0e32dc41-84cc-42d1-bbf8-be4aa4d4b010\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2tb\" (UniqueName: \"kubernetes.io/projected/f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3-kube-api-access-lv2tb\") pod \"placement-operator-controller-manager-8497b45c89-2w6pw\" (UID: \"f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.906822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-622vr\" (UniqueName: \"kubernetes.io/projected/655da61c-572a-43c2-8d53-c3a3e0f95d43-kube-api-access-622vr\") pod \"nova-operator-controller-manager-567668f5cf-x87ll\" (UID: \"655da61c-572a-43c2-8d53-c3a3e0f95d43\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.908169 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.918278 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-7cjck"] Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.922491 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-7cjck"] Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.922590 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.925382 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4s2nf" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.934220 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hl2\" (UniqueName: \"kubernetes.io/projected/eaf1fbac-75fa-4442-811d-8f51e3a1e66b-kube-api-access-j2hl2\") pod \"manila-operator-controller-manager-54f6768c69-g4tkq\" (UID: \"eaf1fbac-75fa-4442-811d-8f51e3a1e66b\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.934985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfdpn\" (UniqueName: \"kubernetes.io/projected/b31580cf-6da4-442c-aa01-bed52414bf52-kube-api-access-tfdpn\") pod \"octavia-operator-controller-manager-69f8888797-875j6\" (UID: \"b31580cf-6da4-442c-aa01-bed52414bf52\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.938112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq29h\" (UniqueName: \"kubernetes.io/projected/aa7dbc9c-deb4-49ae-a0c7-2130343cae10-kube-api-access-jq29h\") pod \"neutron-operator-controller-manager-64ddbf8bb-k6k2h\" (UID: \"aa7dbc9c-deb4-49ae-a0c7-2130343cae10\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.952294 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.955622 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m"] Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.957196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4hz5\" (UniqueName: \"kubernetes.io/projected/61c9ed9f-0dff-4560-a6d3-a621e1a6ff09-kube-api-access-r4hz5\") pod \"ovn-operator-controller-manager-d44cf6b75-972mw\" (UID: \"61c9ed9f-0dff-4560-a6d3-a621e1a6ff09\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.957802 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.959508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7mtd\" (UniqueName: \"kubernetes.io/projected/0e32dc41-84cc-42d1-bbf8-be4aa4d4b010-kube-api-access-n7mtd\") pod \"mariadb-operator-controller-manager-6994f66f48-5rm6j\" (UID: \"0e32dc41-84cc-42d1-bbf8-be4aa4d4b010\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.972040 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.983417 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-cllg4" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.983509 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m"] Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.988589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-622vr\" (UniqueName: \"kubernetes.io/projected/655da61c-572a-43c2-8d53-c3a3e0f95d43-kube-api-access-622vr\") pod \"nova-operator-controller-manager-567668f5cf-x87ll\" (UID: \"655da61c-572a-43c2-8d53-c3a3e0f95d43\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" Feb 19 18:48:50 crc kubenswrapper[4749]: I0219 18:48:50.988654 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015078 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpw4c\" (UniqueName: \"kubernetes.io/projected/c8870691-c7b5-4715-8db1-2ac0f6c56ad9-kube-api-access-tpw4c\") pod \"swift-operator-controller-manager-68f46476f-mcgwg\" (UID: \"c8870691-c7b5-4715-8db1-2ac0f6c56ad9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2cl\" (UniqueName: \"kubernetes.io/projected/aaf03c23-f79b-4c42-9350-dd35ace208e3-kube-api-access-xr2cl\") pod \"watcher-operator-controller-manager-56fd5cc5c9-s7k5m\" (UID: \"aaf03c23-f79b-4c42-9350-dd35ace208e3\") " pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf7k5\" (UniqueName: \"kubernetes.io/projected/a8475ec0-8fed-454b-9d2e-7008db016ae4-kube-api-access-jf7k5\") pod \"test-operator-controller-manager-7866795846-7cjck\" (UID: \"a8475ec0-8fed-454b-9d2e-7008db016ae4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015173 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2tb\" (UniqueName: \"kubernetes.io/projected/f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3-kube-api-access-lv2tb\") pod \"placement-operator-controller-manager-8497b45c89-2w6pw\" (UID: \"f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h2j6\" (UniqueName: \"kubernetes.io/projected/e6eac27e-c253-4729-9171-7adca82bbf48-kube-api-access-2h2j6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-xx724\" (UID: \"e6eac27e-c253-4729-9171-7adca82bbf48\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.015292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgwk\" (UniqueName: \"kubernetes.io/projected/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-kube-api-access-jvgwk\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.015739 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.015784 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert podName:0c080714-223f-4954-81ad-0fbf2d7ceff1 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:52.015770466 +0000 UTC m=+905.976990420 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert") pod "infra-operator-controller-manager-79d975b745-c5zvr" (UID: "0c080714-223f-4954-81ad-0fbf2d7ceff1") : secret "infra-operator-webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.016399 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.016430 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert podName:b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d nodeName:}" failed. No retries permitted until 2026-02-19 18:48:51.516421732 +0000 UTC m=+905.477641686 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cft725" (UID: "b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.034639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2tb\" (UniqueName: \"kubernetes.io/projected/f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3-kube-api-access-lv2tb\") pod \"placement-operator-controller-manager-8497b45c89-2w6pw\" (UID: \"f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.035182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgwk\" (UniqueName: \"kubernetes.io/projected/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-kube-api-access-jvgwk\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.047489 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.058012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpw4c\" (UniqueName: \"kubernetes.io/projected/c8870691-c7b5-4715-8db1-2ac0f6c56ad9-kube-api-access-tpw4c\") pod \"swift-operator-controller-manager-68f46476f-mcgwg\" (UID: \"c8870691-c7b5-4715-8db1-2ac0f6c56ad9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.068907 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.069774 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: W0219 18:48:51.073269 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872c81d0_4024_4678_a081_6698ee2fe586.slice/crio-31f3fa3a52e0bfdb60d7aaa15eea5b4f5065da3ae1b221eb726220f307b7d2d6 WatchSource:0}: Error finding container 31f3fa3a52e0bfdb60d7aaa15eea5b4f5065da3ae1b221eb726220f307b7d2d6: Status 404 returned error can't find the container with id 31f3fa3a52e0bfdb60d7aaa15eea5b4f5065da3ae1b221eb726220f307b7d2d6 Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.073538 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.073762 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.074006 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rbvqj" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.079211 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.092941 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.104632 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.116678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h2j6\" (UniqueName: \"kubernetes.io/projected/e6eac27e-c253-4729-9171-7adca82bbf48-kube-api-access-2h2j6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-xx724\" (UID: \"e6eac27e-c253-4729-9171-7adca82bbf48\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.116753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2cl\" (UniqueName: \"kubernetes.io/projected/aaf03c23-f79b-4c42-9350-dd35ace208e3-kube-api-access-xr2cl\") pod \"watcher-operator-controller-manager-56fd5cc5c9-s7k5m\" (UID: \"aaf03c23-f79b-4c42-9350-dd35ace208e3\") " pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.116786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf7k5\" (UniqueName: \"kubernetes.io/projected/a8475ec0-8fed-454b-9d2e-7008db016ae4-kube-api-access-jf7k5\") pod \"test-operator-controller-manager-7866795846-7cjck\" (UID: 
\"a8475ec0-8fed-454b-9d2e-7008db016ae4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.136999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2cl\" (UniqueName: \"kubernetes.io/projected/aaf03c23-f79b-4c42-9350-dd35ace208e3-kube-api-access-xr2cl\") pod \"watcher-operator-controller-manager-56fd5cc5c9-s7k5m\" (UID: \"aaf03c23-f79b-4c42-9350-dd35ace208e3\") " pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.137152 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.137853 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.138070 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.141532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-v7tmj" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.146553 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf7k5\" (UniqueName: \"kubernetes.io/projected/a8475ec0-8fed-454b-9d2e-7008db016ae4-kube-api-access-jf7k5\") pod \"test-operator-controller-manager-7866795846-7cjck\" (UID: \"a8475ec0-8fed-454b-9d2e-7008db016ae4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.150297 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.150442 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.153803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h2j6\" (UniqueName: \"kubernetes.io/projected/e6eac27e-c253-4729-9171-7adca82bbf48-kube-api-access-2h2j6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-xx724\" (UID: \"e6eac27e-c253-4729-9171-7adca82bbf48\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.202878 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-kqx52"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.214228 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.217606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.217661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4wq\" (UniqueName: \"kubernetes.io/projected/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-kube-api-access-7h4wq\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: 
\"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.217714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.217750 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7shf\" (UniqueName: \"kubernetes.io/projected/4a2510e7-7b2d-445a-b092-74831cb6701e-kube-api-access-v7shf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtd4f\" (UID: \"4a2510e7-7b2d-445a-b092-74831cb6701e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.232100 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-54ths"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.235734 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.241365 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.258146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.278812 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.292792 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.319125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.319196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4wq\" (UniqueName: \"kubernetes.io/projected/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-kube-api-access-7h4wq\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.319384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: 
\"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.319420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7shf\" (UniqueName: \"kubernetes.io/projected/4a2510e7-7b2d-445a-b092-74831cb6701e-kube-api-access-v7shf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtd4f\" (UID: \"4a2510e7-7b2d-445a-b092-74831cb6701e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.319934 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.319990 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:51.819973391 +0000 UTC m=+905.781193345 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "metrics-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.320314 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.320400 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:51.8203813 +0000 UTC m=+905.781601254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.350858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7shf\" (UniqueName: \"kubernetes.io/projected/4a2510e7-7b2d-445a-b092-74831cb6701e-kube-api-access-v7shf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtd4f\" (UID: \"4a2510e7-7b2d-445a-b092-74831cb6701e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.350876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4wq\" (UniqueName: \"kubernetes.io/projected/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-kube-api-access-7h4wq\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.354386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc" event={"ID":"b5e9baf7-cc67-4cde-ba8f-256eb3c5601f","Type":"ContainerStarted","Data":"f5a68fc167dc599a53ef343d41396309ddb859d849ad9ee6724a6aaa5185baba"} Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.355356 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx" event={"ID":"82c521c0-6968-4298-afc8-e4aac617b61d","Type":"ContainerStarted","Data":"d685ada7d52f0f3eed9d35f1215e0f5a8252786c90674b51b622b0c53e4273a4"} Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.356182 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52" event={"ID":"872c81d0-4024-4678-a081-6698ee2fe586","Type":"ContainerStarted","Data":"31f3fa3a52e0bfdb60d7aaa15eea5b4f5065da3ae1b221eb726220f307b7d2d6"} Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.356939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths" event={"ID":"ed276a06-3dcf-475c-8d9c-1ee1c364f783","Type":"ContainerStarted","Data":"af412a3237ff8e7af09a29e8c4e1072da902996c61e460afde32684b74154bac"} Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.463999 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.479950 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.522782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.523192 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.523243 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert podName:b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d nodeName:}" failed. 
No retries permitted until 2026-02-19 18:48:52.523229334 +0000 UTC m=+906.484449288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cft725" (UID: "b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.530495 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f"] Feb 19 18:48:51 crc kubenswrapper[4749]: W0219 18:48:51.532730 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7938ade1_7dc4_4927_b620_4cdcb7125a94.slice/crio-9ae3e2f640a7776ee01ed4a5aa0cee06a4056a44200282680d81b342b06b4c07 WatchSource:0}: Error finding container 9ae3e2f640a7776ee01ed4a5aa0cee06a4056a44200282680d81b342b06b4c07: Status 404 returned error can't find the container with id 9ae3e2f640a7776ee01ed4a5aa0cee06a4056a44200282680d81b342b06b4c07 Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.546118 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq"] Feb 19 18:48:51 crc kubenswrapper[4749]: W0219 18:48:51.591400 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184b233b_5456_42e8_a09b_61221754095e.slice/crio-7e8f92870522838c08c62af279b84e72b7d086adad1a41daf4f1d7558300f305 WatchSource:0}: Error finding container 7e8f92870522838c08c62af279b84e72b7d086adad1a41daf4f1d7558300f305: Status 404 returned error can't find the container with id 7e8f92870522838c08c62af279b84e72b7d086adad1a41daf4f1d7558300f305 Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.739507 4749 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq"] Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.827897 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.828018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.828218 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.828284 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:52.828264038 +0000 UTC m=+906.789483992 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "webhook-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.828850 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: E0219 18:48:51.830335 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:52.830325027 +0000 UTC m=+906.791544991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "metrics-server-cert" not found Feb 19 18:48:51 crc kubenswrapper[4749]: W0219 18:48:51.918994 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa7dbc9c_deb4_49ae_a0c7_2130343cae10.slice/crio-39f174458e3d5abe734d59f65f88038fa6d656cf52ad319539b7c9bef0290ed5 WatchSource:0}: Error finding container 39f174458e3d5abe734d59f65f88038fa6d656cf52ad319539b7c9bef0290ed5: Status 404 returned error can't find the container with id 39f174458e3d5abe734d59f65f88038fa6d656cf52ad319539b7c9bef0290ed5 Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.922098 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h"] Feb 19 18:48:51 crc kubenswrapper[4749]: W0219 18:48:51.928058 4749 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod655da61c_572a_43c2_8d53_c3a3e0f95d43.slice/crio-1cc136e531800aba27267a565196134b16a9602134f379296d8740b2aadf5901 WatchSource:0}: Error finding container 1cc136e531800aba27267a565196134b16a9602134f379296d8740b2aadf5901: Status 404 returned error can't find the container with id 1cc136e531800aba27267a565196134b16a9602134f379296d8740b2aadf5901 Feb 19 18:48:51 crc kubenswrapper[4749]: I0219 18:48:51.940067 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll"] Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.007508 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-875j6"] Feb 19 18:48:52 crc kubenswrapper[4749]: W0219 18:48:52.009771 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e32dc41_84cc_42d1_bbf8_be4aa4d4b010.slice/crio-fadee557ad774aa68c8c283b1e4503feb0acdf92a518b7cfc35af3afeab9f1ad WatchSource:0}: Error finding container fadee557ad774aa68c8c283b1e4503feb0acdf92a518b7cfc35af3afeab9f1ad: Status 404 returned error can't find the container with id fadee557ad774aa68c8c283b1e4503feb0acdf92a518b7cfc35af3afeab9f1ad Feb 19 18:48:52 crc kubenswrapper[4749]: W0219 18:48:52.013628 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31580cf_6da4_442c_aa01_bed52414bf52.slice/crio-1a86a79744b0aaf6b71b0371c4dccc457d7ef31c6abac26776d3966267bc03e9 WatchSource:0}: Error finding container 1a86a79744b0aaf6b71b0371c4dccc457d7ef31c6abac26776d3966267bc03e9: Status 404 returned error can't find the container with id 1a86a79744b0aaf6b71b0371c4dccc457d7ef31c6abac26776d3966267bc03e9 Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.017152 4749 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j"] Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.024820 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg"] Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.030769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq"] Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.031789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.031954 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.032042 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert podName:0c080714-223f-4954-81ad-0fbf2d7ceff1 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:54.032009423 +0000 UTC m=+907.993229377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert") pod "infra-operator-controller-manager-79d975b745-c5zvr" (UID: "0c080714-223f-4954-81ad-0fbf2d7ceff1") : secret "infra-operator-webhook-server-cert" not found Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.161866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724"] Feb 19 18:48:52 crc kubenswrapper[4749]: W0219 18:48:52.166015 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6eac27e_c253_4729_9171_7adca82bbf48.slice/crio-b186b210f5fd14d2161422a7dd6710f6fd24ddf1d3dade8ba9555b05b034d6b5 WatchSource:0}: Error finding container b186b210f5fd14d2161422a7dd6710f6fd24ddf1d3dade8ba9555b05b034d6b5: Status 404 returned error can't find the container with id b186b210f5fd14d2161422a7dd6710f6fd24ddf1d3dade8ba9555b05b034d6b5 Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.172740 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f"] Feb 19 18:48:52 crc kubenswrapper[4749]: W0219 18:48:52.173643 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2510e7_7b2d_445a_b092_74831cb6701e.slice/crio-6d9d67f7b7b88469bf7b6a6daa4c0b2898bfca4c40d9d7750f77a787b15259a6 WatchSource:0}: Error finding container 6d9d67f7b7b88469bf7b6a6daa4c0b2898bfca4c40d9d7750f77a787b15259a6: Status 404 returned error can't find the container with id 6d9d67f7b7b88469bf7b6a6daa4c0b2898bfca4c40d9d7750f77a787b15259a6 Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.185091 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m"] Feb 19 18:48:52 crc 
kubenswrapper[4749]: E0219 18:48:52.189525 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v7shf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
rabbitmq-cluster-operator-manager-668c99d594-rtd4f_openstack-operators(4a2510e7-7b2d-445a-b092-74831cb6701e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.189995 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.75:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xr2cl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-56fd5cc5c9-s7k5m_openstack-operators(aaf03c23-f79b-4c42-9350-dd35ace208e3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.190857 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" podUID="4a2510e7-7b2d-445a-b092-74831cb6701e" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.191279 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" podUID="aaf03c23-f79b-4c42-9350-dd35ace208e3" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.194933 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jf7k5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-7cjck_openstack-operators(a8475ec0-8fed-454b-9d2e-7008db016ae4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.196110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" podUID="a8475ec0-8fed-454b-9d2e-7008db016ae4" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.198255 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-7cjck"] Feb 19 18:48:52 crc kubenswrapper[4749]: W0219 18:48:52.204091 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61c9ed9f_0dff_4560_a6d3_a621e1a6ff09.slice/crio-9c5a80970c9a7355a4bed3447128b07c16989252d389645d631f309e9268301a WatchSource:0}: Error finding container 9c5a80970c9a7355a4bed3447128b07c16989252d389645d631f309e9268301a: Status 404 returned error can't find the container with id 
9c5a80970c9a7355a4bed3447128b07c16989252d389645d631f309e9268301a Feb 19 18:48:52 crc kubenswrapper[4749]: W0219 18:48:52.206818 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7febf8e_9f11_446d_b8f1_4bc8e0e2dce3.slice/crio-0f2d799d5af05bc650aa8135e65aca779fa237ec7d7f4e7874e748ff9a7d18a3 WatchSource:0}: Error finding container 0f2d799d5af05bc650aa8135e65aca779fa237ec7d7f4e7874e748ff9a7d18a3: Status 404 returned error can't find the container with id 0f2d799d5af05bc650aa8135e65aca779fa237ec7d7f4e7874e748ff9a7d18a3 Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.209005 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lv2tb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-2w6pw_openstack-operators(f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.210279 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" podUID="f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.210922 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r4hz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-972mw_openstack-operators(61c9ed9f-0dff-4560-a6d3-a621e1a6ff09): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.216086 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" podUID="61c9ed9f-0dff-4560-a6d3-a621e1a6ff09" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.218604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw"] Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.223395 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw"] Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.395119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" 
event={"ID":"a8475ec0-8fed-454b-9d2e-7008db016ae4","Type":"ContainerStarted","Data":"f202cbbff8af29f60f27b72a5af5119f49cf75d22fb22cd5bd81cad34b100178"} Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.396747 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" podUID="a8475ec0-8fed-454b-9d2e-7008db016ae4" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.398335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" event={"ID":"61c9ed9f-0dff-4560-a6d3-a621e1a6ff09","Type":"ContainerStarted","Data":"9c5a80970c9a7355a4bed3447128b07c16989252d389645d631f309e9268301a"} Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.399801 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" podUID="61c9ed9f-0dff-4560-a6d3-a621e1a6ff09" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.401204 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq" event={"ID":"95419345-6f7d-4cb6-b0c6-75bdebf35ade","Type":"ContainerStarted","Data":"5092462df6db2cdedf376cc6578f521e5f25306812b65a0a1119d48f97827f1d"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.406918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6" 
event={"ID":"b31580cf-6da4-442c-aa01-bed52414bf52","Type":"ContainerStarted","Data":"1a86a79744b0aaf6b71b0371c4dccc457d7ef31c6abac26776d3966267bc03e9"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.408718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq" event={"ID":"184b233b-5456-42e8-a09b-61221754095e","Type":"ContainerStarted","Data":"7e8f92870522838c08c62af279b84e72b7d086adad1a41daf4f1d7558300f305"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.410501 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" event={"ID":"0e32dc41-84cc-42d1-bbf8-be4aa4d4b010","Type":"ContainerStarted","Data":"fadee557ad774aa68c8c283b1e4503feb0acdf92a518b7cfc35af3afeab9f1ad"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.412266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" event={"ID":"655da61c-572a-43c2-8d53-c3a3e0f95d43","Type":"ContainerStarted","Data":"1cc136e531800aba27267a565196134b16a9602134f379296d8740b2aadf5901"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.417571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f" event={"ID":"bda10183-e834-4f98-a0cf-47ce14f1d333","Type":"ContainerStarted","Data":"52d9527d8ac15a05aa8af86ea2c48f8b4dafb3dde4c0fec673fcbe5d352764cf"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.419562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" event={"ID":"f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3","Type":"ContainerStarted","Data":"0f2d799d5af05bc650aa8135e65aca779fa237ec7d7f4e7874e748ff9a7d18a3"} Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.420760 4749 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" podUID="f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.433206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" event={"ID":"c8870691-c7b5-4715-8db1-2ac0f6c56ad9","Type":"ContainerStarted","Data":"6d39f8b1da4b29363e80eb457c3d1a2cf40dca64b4f9cf28cfcac6ae4a6ea65b"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.434797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq" event={"ID":"eaf1fbac-75fa-4442-811d-8f51e3a1e66b","Type":"ContainerStarted","Data":"0cd7cd3d00538260c92a22d383ffb347a7491e6c12f1e66b3282c09d718c7349"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.446136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" event={"ID":"e6eac27e-c253-4729-9171-7adca82bbf48","Type":"ContainerStarted","Data":"b186b210f5fd14d2161422a7dd6710f6fd24ddf1d3dade8ba9555b05b034d6b5"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.447619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h" event={"ID":"aa7dbc9c-deb4-49ae-a0c7-2130343cae10","Type":"ContainerStarted","Data":"39f174458e3d5abe734d59f65f88038fa6d656cf52ad319539b7c9bef0290ed5"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.450420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x" 
event={"ID":"7938ade1-7dc4-4927-b620-4cdcb7125a94","Type":"ContainerStarted","Data":"9ae3e2f640a7776ee01ed4a5aa0cee06a4056a44200282680d81b342b06b4c07"} Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.451894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" event={"ID":"4a2510e7-7b2d-445a-b092-74831cb6701e","Type":"ContainerStarted","Data":"6d9d67f7b7b88469bf7b6a6daa4c0b2898bfca4c40d9d7750f77a787b15259a6"} Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.454345 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" podUID="4a2510e7-7b2d-445a-b092-74831cb6701e" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.454544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" event={"ID":"aaf03c23-f79b-4c42-9350-dd35ace208e3","Type":"ContainerStarted","Data":"21d3be5d00baa5e5c1862964d9f50fa98aab93a236b8ff32e394a05e7aa1823f"} Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.456189 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" podUID="aaf03c23-f79b-4c42-9350-dd35ace208e3" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.548533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.548764 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.548857 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert podName:b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d nodeName:}" failed. No retries permitted until 2026-02-19 18:48:54.548835215 +0000 UTC m=+908.510055179 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cft725" (UID: "b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.858856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:52 crc kubenswrapper[4749]: I0219 18:48:52.858925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: 
\"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.859063 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.859110 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:54.859094624 +0000 UTC m=+908.820314578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "webhook-server-cert" not found Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.859463 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 18:48:52 crc kubenswrapper[4749]: E0219 18:48:52.859490 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:54.859481443 +0000 UTC m=+908.820701387 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "metrics-server-cert" not found Feb 19 18:48:53 crc kubenswrapper[4749]: E0219 18:48:53.462085 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" podUID="f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3" Feb 19 18:48:53 crc kubenswrapper[4749]: E0219 18:48:53.462145 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" podUID="a8475ec0-8fed-454b-9d2e-7008db016ae4" Feb 19 18:48:53 crc kubenswrapper[4749]: E0219 18:48:53.462626 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" podUID="61c9ed9f-0dff-4560-a6d3-a621e1a6ff09" Feb 19 18:48:53 crc kubenswrapper[4749]: E0219 18:48:53.462656 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" podUID="4a2510e7-7b2d-445a-b092-74831cb6701e" Feb 19 18:48:53 crc kubenswrapper[4749]: E0219 18:48:53.462752 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" podUID="aaf03c23-f79b-4c42-9350-dd35ace208e3" Feb 19 18:48:54 crc kubenswrapper[4749]: I0219 18:48:54.074644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.074882 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.074978 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert podName:0c080714-223f-4954-81ad-0fbf2d7ceff1 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:58.074956354 +0000 UTC m=+912.036176308 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert") pod "infra-operator-controller-manager-79d975b745-c5zvr" (UID: "0c080714-223f-4954-81ad-0fbf2d7ceff1") : secret "infra-operator-webhook-server-cert" not found Feb 19 18:48:54 crc kubenswrapper[4749]: I0219 18:48:54.684626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.684753 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.684796 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert podName:b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d nodeName:}" failed. No retries permitted until 2026-02-19 18:48:58.684781595 +0000 UTC m=+912.646001549 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cft725" (UID: "b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:54 crc kubenswrapper[4749]: I0219 18:48:54.887569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:54 crc kubenswrapper[4749]: I0219 18:48:54.887661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.887721 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.887771 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:58.887756091 +0000 UTC m=+912.848976035 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "metrics-server-cert" not found Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.887859 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 18:48:54 crc kubenswrapper[4749]: E0219 18:48:54.887958 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:48:58.887922545 +0000 UTC m=+912.849142499 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "webhook-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: I0219 18:48:58.136177 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.136360 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.136907 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert 
podName:0c080714-223f-4954-81ad-0fbf2d7ceff1 nodeName:}" failed. No retries permitted until 2026-02-19 18:49:06.136890719 +0000 UTC m=+920.098110673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert") pod "infra-operator-controller-manager-79d975b745-c5zvr" (UID: "0c080714-223f-4954-81ad-0fbf2d7ceff1") : secret "infra-operator-webhook-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: I0219 18:48:58.744775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.744959 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.745697 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert podName:b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d nodeName:}" failed. No retries permitted until 2026-02-19 18:49:06.745678475 +0000 UTC m=+920.706898429 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cft725" (UID: "b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: I0219 18:48:58.947831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:58 crc kubenswrapper[4749]: I0219 18:48:58.947925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.948096 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.948204 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:49:06.94816942 +0000 UTC m=+920.909389374 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "webhook-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.948005 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 18:48:58 crc kubenswrapper[4749]: E0219 18:48:58.948350 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:49:06.948330494 +0000 UTC m=+920.909550448 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "metrics-server-cert" not found Feb 19 18:49:04 crc kubenswrapper[4749]: I0219 18:49:04.679792 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.832747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq" event={"ID":"eaf1fbac-75fa-4442-811d-8f51e3a1e66b","Type":"ContainerStarted","Data":"d138ca0daf0ff6c53e29dc2141a1fdfd4a13739ff428e3c91b06ac8babede1f7"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.833902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.849898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52" event={"ID":"872c81d0-4024-4678-a081-6698ee2fe586","Type":"ContainerStarted","Data":"cff871cb1c59e02734f06c5c3a79095bf809429d88fbe4c7f46ba48d7706a098"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.850701 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.866798 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq" podStartSLOduration=3.203125764 podStartE2EDuration="15.866783175s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.752546343 +0000 UTC m=+905.713766287" lastFinishedPulling="2026-02-19 18:49:04.416203744 +0000 UTC m=+918.377423698" observedRunningTime="2026-02-19 18:49:05.866160631 +0000 UTC m=+919.827380585" watchObservedRunningTime="2026-02-19 18:49:05.866783175 +0000 UTC m=+919.828003129" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.868736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" event={"ID":"655da61c-572a-43c2-8d53-c3a3e0f95d43","Type":"ContainerStarted","Data":"990b9eb2c9291dd5ddfaddfc0acbc1a1786d8039289f7116e5f128b181d78b37"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.869433 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.881488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" 
event={"ID":"e6eac27e-c253-4729-9171-7adca82bbf48","Type":"ContainerStarted","Data":"c090a341602435d202889e2d6de829325b10e463b3b763ef6a2f4b3cdffccbc4"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.882274 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.897939 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52" podStartSLOduration=2.625419355 podStartE2EDuration="15.897922956s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.145839435 +0000 UTC m=+905.107059389" lastFinishedPulling="2026-02-19 18:49:04.418343036 +0000 UTC m=+918.379562990" observedRunningTime="2026-02-19 18:49:05.89397481 +0000 UTC m=+919.855194764" watchObservedRunningTime="2026-02-19 18:49:05.897922956 +0000 UTC m=+919.859142910" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.900393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc" event={"ID":"b5e9baf7-cc67-4cde-ba8f-256eb3c5601f","Type":"ContainerStarted","Data":"ba1467d7df79d26cb9515119ad3c4ae8500ebf6ac058875c1c772b600df01576"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.901123 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.919393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f" event={"ID":"bda10183-e834-4f98-a0cf-47ce14f1d333","Type":"ContainerStarted","Data":"3dc88105d7f40508e9d85ca08db392897a84dc830f704f143708dccd4bcaac60"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.919842 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.935266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x" event={"ID":"7938ade1-7dc4-4927-b620-4cdcb7125a94","Type":"ContainerStarted","Data":"01f198c7ab693bcd4cd4c27c09e6c09a480b77dd492c01da6f797180fce3b5a2"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.935724 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.955017 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" podStartSLOduration=3.711681526 podStartE2EDuration="15.955004629s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.174533331 +0000 UTC m=+906.135753285" lastFinishedPulling="2026-02-19 18:49:04.417856444 +0000 UTC m=+918.379076388" observedRunningTime="2026-02-19 18:49:05.945289935 +0000 UTC m=+919.906509889" watchObservedRunningTime="2026-02-19 18:49:05.955004629 +0000 UTC m=+919.916224583" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.956224 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h" event={"ID":"aa7dbc9c-deb4-49ae-a0c7-2130343cae10","Type":"ContainerStarted","Data":"9334191d5e3818a8d21238a7dc9a7a3a5832c01dd78eca691c6bb0d4dcbba629"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.956849 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.976521 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6" event={"ID":"b31580cf-6da4-442c-aa01-bed52414bf52","Type":"ContainerStarted","Data":"71c085a9363728de23b8ac4de80277a635ae821f51d52a761e88ba08d2667981"} Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.977322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6" Feb 19 18:49:05 crc kubenswrapper[4749]: I0219 18:49:05.985196 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" podStartSLOduration=3.368207591 podStartE2EDuration="15.985180476s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.93094312 +0000 UTC m=+905.892163074" lastFinishedPulling="2026-02-19 18:49:04.547916005 +0000 UTC m=+918.509135959" observedRunningTime="2026-02-19 18:49:05.979469578 +0000 UTC m=+919.940689542" watchObservedRunningTime="2026-02-19 18:49:05.985180476 +0000 UTC m=+919.946400430" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.007309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" event={"ID":"c8870691-c7b5-4715-8db1-2ac0f6c56ad9","Type":"ContainerStarted","Data":"b664cee4d9714864496f8ebc178d77104ffbb36e1d12d632ba842db7d9892108"} Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.008005 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.021806 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq" 
event={"ID":"95419345-6f7d-4cb6-b0c6-75bdebf35ade","Type":"ContainerStarted","Data":"066a8993e45e704eb4f2c55fe83cf3e3c9380cbf0f7bf18b696f71bec8130d08"} Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.021857 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.034164 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f" podStartSLOduration=3.16442506 podStartE2EDuration="16.034139844s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.548590142 +0000 UTC m=+905.509810096" lastFinishedPulling="2026-02-19 18:49:04.418304926 +0000 UTC m=+918.379524880" observedRunningTime="2026-02-19 18:49:06.003589199 +0000 UTC m=+919.964809153" watchObservedRunningTime="2026-02-19 18:49:06.034139844 +0000 UTC m=+919.995359818" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.034459 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6" podStartSLOduration=3.632629738 podStartE2EDuration="16.034454152s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.015664682 +0000 UTC m=+905.976884636" lastFinishedPulling="2026-02-19 18:49:04.417489096 +0000 UTC m=+918.378709050" observedRunningTime="2026-02-19 18:49:06.029308878 +0000 UTC m=+919.990528832" watchObservedRunningTime="2026-02-19 18:49:06.034454152 +0000 UTC m=+919.995674106" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.047131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx" 
event={"ID":"82c521c0-6968-4298-afc8-e4aac617b61d","Type":"ContainerStarted","Data":"88c80b7e834c6615de152f0f73c4597e91f90c199329b28f17313fb050c7beea"} Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.047172 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.071510 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h" podStartSLOduration=3.576208248 podStartE2EDuration="16.071494223s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.922504158 +0000 UTC m=+905.883724112" lastFinishedPulling="2026-02-19 18:49:04.417790133 +0000 UTC m=+918.379010087" observedRunningTime="2026-02-19 18:49:06.066503024 +0000 UTC m=+920.027722978" watchObservedRunningTime="2026-02-19 18:49:06.071494223 +0000 UTC m=+920.032714177" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.073090 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq" event={"ID":"184b233b-5456-42e8-a09b-61221754095e","Type":"ContainerStarted","Data":"3fd5eb314e78ece696f3d2ca9d82a19cc3cb525aae52a3e6fd1b30c4df51ae91"} Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.073608 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.075112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" event={"ID":"0e32dc41-84cc-42d1-bbf8-be4aa4d4b010","Type":"ContainerStarted","Data":"bd3e5563bc84beaa38d01de46387e008b36cca7ed11999bb33ed3cd9fa83fa43"} Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.075457 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.076524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths" event={"ID":"ed276a06-3dcf-475c-8d9c-1ee1c364f783","Type":"ContainerStarted","Data":"dc14dad2d0e47794094eb497b4b85d6e0b6bbee0bd98808ed491fecbd7979814"} Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.076867 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.090532 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc" podStartSLOduration=2.874153726 podStartE2EDuration="16.090511771s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.200889735 +0000 UTC m=+905.162109689" lastFinishedPulling="2026-02-19 18:49:04.41724778 +0000 UTC m=+918.378467734" observedRunningTime="2026-02-19 18:49:06.084380674 +0000 UTC m=+920.045600628" watchObservedRunningTime="2026-02-19 18:49:06.090511771 +0000 UTC m=+920.051731725" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.135252 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x" podStartSLOduration=3.253245631 podStartE2EDuration="16.135229038s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.535458807 +0000 UTC m=+905.496678751" lastFinishedPulling="2026-02-19 18:49:04.417442204 +0000 UTC m=+918.378662158" observedRunningTime="2026-02-19 18:49:06.129284775 +0000 UTC m=+920.090504729" watchObservedRunningTime="2026-02-19 18:49:06.135229038 +0000 UTC 
m=+920.096449002" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.165855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.178835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c080714-223f-4954-81ad-0fbf2d7ceff1-cert\") pod \"infra-operator-controller-manager-79d975b745-c5zvr\" (UID: \"0c080714-223f-4954-81ad-0fbf2d7ceff1\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.180753 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" podStartSLOduration=3.790338044 podStartE2EDuration="16.180732944s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.028728995 +0000 UTC m=+905.989948939" lastFinishedPulling="2026-02-19 18:49:04.419123885 +0000 UTC m=+918.380343839" observedRunningTime="2026-02-19 18:49:06.16525745 +0000 UTC m=+920.126477404" watchObservedRunningTime="2026-02-19 18:49:06.180732944 +0000 UTC m=+920.141952898" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.227081 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths" podStartSLOduration=2.995178942 podStartE2EDuration="16.227067139s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.184241466 +0000 UTC m=+905.145461420" lastFinishedPulling="2026-02-19 18:49:04.416129663 +0000 UTC m=+918.377349617" 
observedRunningTime="2026-02-19 18:49:06.226896364 +0000 UTC m=+920.188116318" watchObservedRunningTime="2026-02-19 18:49:06.227067139 +0000 UTC m=+920.188287093" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.233436 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq" podStartSLOduration=3.84382067 podStartE2EDuration="16.233419301s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.027522956 +0000 UTC m=+905.988742910" lastFinishedPulling="2026-02-19 18:49:04.417121587 +0000 UTC m=+918.378341541" observedRunningTime="2026-02-19 18:49:06.198384128 +0000 UTC m=+920.159604102" watchObservedRunningTime="2026-02-19 18:49:06.233419301 +0000 UTC m=+920.194639255" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.252387 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.270196 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx" podStartSLOduration=3.052992051 podStartE2EDuration="16.270179596s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.201298735 +0000 UTC m=+905.162518689" lastFinishedPulling="2026-02-19 18:49:04.41848628 +0000 UTC m=+918.379706234" observedRunningTime="2026-02-19 18:49:06.257516681 +0000 UTC m=+920.218736635" watchObservedRunningTime="2026-02-19 18:49:06.270179596 +0000 UTC m=+920.231399550" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.361517 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" podStartSLOduration=3.9574175069999997 podStartE2EDuration="16.361499825s" 
podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.012133917 +0000 UTC m=+905.973353871" lastFinishedPulling="2026-02-19 18:49:04.416216235 +0000 UTC m=+918.377436189" observedRunningTime="2026-02-19 18:49:06.322763933 +0000 UTC m=+920.283983887" watchObservedRunningTime="2026-02-19 18:49:06.361499825 +0000 UTC m=+920.322719789" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.782708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.788811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cft725\" (UID: \"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.866570 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq" podStartSLOduration=4.066255344 podStartE2EDuration="16.866550513s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:51.615932646 +0000 UTC m=+905.577152610" lastFinishedPulling="2026-02-19 18:49:04.416227825 +0000 UTC m=+918.377447779" observedRunningTime="2026-02-19 18:49:06.36583851 +0000 UTC m=+920.327058464" watchObservedRunningTime="2026-02-19 18:49:06.866550513 +0000 UTC m=+920.827770467" Feb 19 18:49:06 crc kubenswrapper[4749]: I0219 18:49:06.870383 4749 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr"] Feb 19 18:49:07 crc kubenswrapper[4749]: I0219 18:49:07.002150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:07 crc kubenswrapper[4749]: I0219 18:49:07.002219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:07 crc kubenswrapper[4749]: E0219 18:49:07.002366 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 18:49:07 crc kubenswrapper[4749]: E0219 18:49:07.002415 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs podName:4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382 nodeName:}" failed. No retries permitted until 2026-02-19 18:49:23.002398343 +0000 UTC m=+936.963618297 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs") pod "openstack-operator-controller-manager-c59d96f56-stlgf" (UID: "4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382") : secret "webhook-server-cert" not found Feb 19 18:49:07 crc kubenswrapper[4749]: I0219 18:49:07.009764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-metrics-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:07 crc kubenswrapper[4749]: I0219 18:49:07.076054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:49:07 crc kubenswrapper[4749]: I0219 18:49:07.107481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" event={"ID":"0c080714-223f-4954-81ad-0fbf2d7ceff1","Type":"ContainerStarted","Data":"0a73381589f01e4254fc5c6d021cf208e9fe64f4b0a6295f60cf1d7e4c89e08a"} Feb 19 18:49:07 crc kubenswrapper[4749]: I0219 18:49:07.447782 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725"] Feb 19 18:49:08 crc kubenswrapper[4749]: I0219 18:49:08.124560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" event={"ID":"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d","Type":"ContainerStarted","Data":"b906484d6eea52c8c7419652a5bda9755a14fb3ce2fddf7342567e786ffc1ba2"} Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.447668 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54ths" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.498677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5k6kx" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.554956 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-kqx52" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.633761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-xmlcc" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.756221 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-nlt7x" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.890169 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-66xhq" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.915777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-7c76f" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.956077 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g4tkq" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.978760 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-g44zq" Feb 19 18:49:10 crc kubenswrapper[4749]: I0219 18:49:10.991957 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-k6k2h" Feb 19 18:49:11 crc kubenswrapper[4749]: I0219 18:49:11.051671 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-x87ll" Feb 19 18:49:11 crc kubenswrapper[4749]: I0219 18:49:11.083990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-875j6" Feb 19 18:49:11 crc kubenswrapper[4749]: I0219 18:49:11.142272 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mcgwg" Feb 19 18:49:11 crc kubenswrapper[4749]: I0219 18:49:11.240991 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-xx724" Feb 19 18:49:11 crc kubenswrapper[4749]: I0219 18:49:11.244336 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-5rm6j" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.177420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" event={"ID":"aaf03c23-f79b-4c42-9350-dd35ace208e3","Type":"ContainerStarted","Data":"908bb08413b9d750851df384629ce8691f4340616977653bc184ef222f576773"} Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.178759 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.179909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" 
event={"ID":"a8475ec0-8fed-454b-9d2e-7008db016ae4","Type":"ContainerStarted","Data":"e21c752df250b55d4fd37d51c591aac09f20b86a814e01f32b4c15e9f5c08cbd"} Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.180266 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.181531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" event={"ID":"0c080714-223f-4954-81ad-0fbf2d7ceff1","Type":"ContainerStarted","Data":"d9aef1304d704ccd80c27eb0d87076ab6306c6eb9a1055868bfd70487752a144"} Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.181605 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.182740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" event={"ID":"4a2510e7-7b2d-445a-b092-74831cb6701e","Type":"ContainerStarted","Data":"0cbd6bdee3e501cd2192ade21156242d1f2bba395506a83b5291ae253ef8e3bb"} Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.184585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" event={"ID":"f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3","Type":"ContainerStarted","Data":"676d05ae2d01ca19e30964251138f954797ebe51f286a56209b00b35ced1aea2"} Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.184929 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.186111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" event={"ID":"61c9ed9f-0dff-4560-a6d3-a621e1a6ff09","Type":"ContainerStarted","Data":"abe78a16136bc1587ce2dfaeea5c7fb80602ed45b3685bc325c6801505d1cc88"} Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.186433 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.187610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" event={"ID":"b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d","Type":"ContainerStarted","Data":"0bf6f2397e73c51565744a181d2ba51c001beccaf236789c46f3d2d37652d287"} Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.188040 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.195020 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" podStartSLOduration=3.051242115 podStartE2EDuration="25.195003259s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.189838638 +0000 UTC m=+906.151058582" lastFinishedPulling="2026-02-19 18:49:14.333599772 +0000 UTC m=+928.294819726" observedRunningTime="2026-02-19 18:49:15.189140247 +0000 UTC m=+929.150360201" watchObservedRunningTime="2026-02-19 18:49:15.195003259 +0000 UTC m=+929.156223213" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.207067 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" podStartSLOduration=3.055309152 podStartE2EDuration="25.207049029s" podCreationTimestamp="2026-02-19 18:48:50 +0000 
UTC" firstStartedPulling="2026-02-19 18:48:52.194772026 +0000 UTC m=+906.155991980" lastFinishedPulling="2026-02-19 18:49:14.346511903 +0000 UTC m=+928.307731857" observedRunningTime="2026-02-19 18:49:15.204647952 +0000 UTC m=+929.165867906" watchObservedRunningTime="2026-02-19 18:49:15.207049029 +0000 UTC m=+929.168268983" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.223334 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" podStartSLOduration=3.098540941 podStartE2EDuration="25.223314681s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.208696329 +0000 UTC m=+906.169916273" lastFinishedPulling="2026-02-19 18:49:14.333470059 +0000 UTC m=+928.294690013" observedRunningTime="2026-02-19 18:49:15.216478316 +0000 UTC m=+929.177698270" watchObservedRunningTime="2026-02-19 18:49:15.223314681 +0000 UTC m=+929.184534635" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.239556 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" podStartSLOduration=3.115681064 podStartE2EDuration="25.239537011s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.210658837 +0000 UTC m=+906.171878791" lastFinishedPulling="2026-02-19 18:49:14.334514784 +0000 UTC m=+928.295734738" observedRunningTime="2026-02-19 18:49:15.232677316 +0000 UTC m=+929.193897270" watchObservedRunningTime="2026-02-19 18:49:15.239537011 +0000 UTC m=+929.200756965" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.296424 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" podStartSLOduration=17.871402104 podStartE2EDuration="25.29640759s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" 
firstStartedPulling="2026-02-19 18:49:06.908450972 +0000 UTC m=+920.869670926" lastFinishedPulling="2026-02-19 18:49:14.333456458 +0000 UTC m=+928.294676412" observedRunningTime="2026-02-19 18:49:15.294627268 +0000 UTC m=+929.255847212" watchObservedRunningTime="2026-02-19 18:49:15.29640759 +0000 UTC m=+929.257627544" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.298417 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" podStartSLOduration=18.429860077 podStartE2EDuration="25.298410478s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:49:07.475336349 +0000 UTC m=+921.436556293" lastFinishedPulling="2026-02-19 18:49:14.34388674 +0000 UTC m=+928.305106694" observedRunningTime="2026-02-19 18:49:15.267548385 +0000 UTC m=+929.228768349" watchObservedRunningTime="2026-02-19 18:49:15.298410478 +0000 UTC m=+929.259630432" Feb 19 18:49:15 crc kubenswrapper[4749]: I0219 18:49:15.317582 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtd4f" podStartSLOduration=3.11124145 podStartE2EDuration="25.31756843s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="2026-02-19 18:48:52.189366667 +0000 UTC m=+906.150586621" lastFinishedPulling="2026-02-19 18:49:14.395693647 +0000 UTC m=+928.356913601" observedRunningTime="2026-02-19 18:49:15.31467403 +0000 UTC m=+929.275893974" watchObservedRunningTime="2026-02-19 18:49:15.31756843 +0000 UTC m=+929.278788384" Feb 19 18:49:21 crc kubenswrapper[4749]: I0219 18:49:21.108674 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-972mw" Feb 19 18:49:21 crc kubenswrapper[4749]: I0219 18:49:21.153311 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" Feb 19 18:49:21 crc kubenswrapper[4749]: I0219 18:49:21.281822 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-7cjck" Feb 19 18:49:21 crc kubenswrapper[4749]: I0219 18:49:21.295897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-56fd5cc5c9-s7k5m" Feb 19 18:49:23 crc kubenswrapper[4749]: I0219 18:49:23.040541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:23 crc kubenswrapper[4749]: I0219 18:49:23.047081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382-webhook-certs\") pod \"openstack-operator-controller-manager-c59d96f56-stlgf\" (UID: \"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382\") " pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:23 crc kubenswrapper[4749]: I0219 18:49:23.192289 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:23 crc kubenswrapper[4749]: I0219 18:49:23.464422 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf"] Feb 19 18:49:23 crc kubenswrapper[4749]: W0219 18:49:23.497334 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a0f1a4e_6bc1_49bb_a62b_9f51ca92b382.slice/crio-52f4e602680d62d6d9c9962ae7a99eb83243798440bbdde70dc3becf9717dac0 WatchSource:0}: Error finding container 52f4e602680d62d6d9c9962ae7a99eb83243798440bbdde70dc3becf9717dac0: Status 404 returned error can't find the container with id 52f4e602680d62d6d9c9962ae7a99eb83243798440bbdde70dc3becf9717dac0 Feb 19 18:49:24 crc kubenswrapper[4749]: I0219 18:49:24.244584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" event={"ID":"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382","Type":"ContainerStarted","Data":"52f4e602680d62d6d9c9962ae7a99eb83243798440bbdde70dc3becf9717dac0"} Feb 19 18:49:25 crc kubenswrapper[4749]: I0219 18:49:25.252018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" event={"ID":"4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382","Type":"ContainerStarted","Data":"320b6bcfc540fc1404c643e246f6f1ae709beb3b1c1c549a50e097cb41f93f71"} Feb 19 18:49:25 crc kubenswrapper[4749]: I0219 18:49:25.252253 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:25 crc kubenswrapper[4749]: I0219 18:49:25.283369 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" 
podStartSLOduration=35.283329111 podStartE2EDuration="35.283329111s" podCreationTimestamp="2026-02-19 18:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:49:25.278567626 +0000 UTC m=+939.239787610" watchObservedRunningTime="2026-02-19 18:49:25.283329111 +0000 UTC m=+939.244549095" Feb 19 18:49:26 crc kubenswrapper[4749]: I0219 18:49:26.259738 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-c5zvr" Feb 19 18:49:27 crc kubenswrapper[4749]: I0219 18:49:27.082420 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cft725" Feb 19 18:49:33 crc kubenswrapper[4749]: I0219 18:49:33.208203 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c59d96f56-stlgf" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.120717 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c6977f9b9-w4jrv"] Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.129125 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.134198 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.134447 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.134590 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.134722 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-k4ssn" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.139711 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c6977f9b9-w4jrv"] Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.164000 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7fcbd895-gmcp6"] Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.165165 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.168506 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.183829 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7fcbd895-gmcp6"] Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.275082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-dns-svc\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.275123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgjkt\" (UniqueName: \"kubernetes.io/projected/2a888ed9-b69c-4846-ab38-c29719ab3ab1-kube-api-access-dgjkt\") pod \"dnsmasq-dns-c6977f9b9-w4jrv\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.275286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a888ed9-b69c-4846-ab38-c29719ab3ab1-config\") pod \"dnsmasq-dns-c6977f9b9-w4jrv\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.275348 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-config\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 
18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.275554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4k8n\" (UniqueName: \"kubernetes.io/projected/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-kube-api-access-v4k8n\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.377434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4k8n\" (UniqueName: \"kubernetes.io/projected/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-kube-api-access-v4k8n\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.377566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-dns-svc\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.377613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjkt\" (UniqueName: \"kubernetes.io/projected/2a888ed9-b69c-4846-ab38-c29719ab3ab1-kube-api-access-dgjkt\") pod \"dnsmasq-dns-c6977f9b9-w4jrv\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.377755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a888ed9-b69c-4846-ab38-c29719ab3ab1-config\") pod \"dnsmasq-dns-c6977f9b9-w4jrv\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc 
kubenswrapper[4749]: I0219 18:49:52.377820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-config\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.378612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a888ed9-b69c-4846-ab38-c29719ab3ab1-config\") pod \"dnsmasq-dns-c6977f9b9-w4jrv\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.395795 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgjkt\" (UniqueName: \"kubernetes.io/projected/2a888ed9-b69c-4846-ab38-c29719ab3ab1-kube-api-access-dgjkt\") pod \"dnsmasq-dns-c6977f9b9-w4jrv\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.449356 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.860685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-config\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.860710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-dns-svc\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:52 crc kubenswrapper[4749]: I0219 18:49:52.869009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4k8n\" (UniqueName: \"kubernetes.io/projected/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-kube-api-access-v4k8n\") pod \"dnsmasq-dns-6b7fcbd895-gmcp6\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:53 crc kubenswrapper[4749]: I0219 18:49:53.081945 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:49:53 crc kubenswrapper[4749]: I0219 18:49:53.211831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c6977f9b9-w4jrv"] Feb 19 18:49:53 crc kubenswrapper[4749]: I0219 18:49:53.439682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" event={"ID":"2a888ed9-b69c-4846-ab38-c29719ab3ab1","Type":"ContainerStarted","Data":"73f622904f71c001fe7c2d27a36dc896c6ee309720d2e6960b2a07909b0f7fe7"} Feb 19 18:49:53 crc kubenswrapper[4749]: I0219 18:49:53.505672 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7fcbd895-gmcp6"] Feb 19 18:49:53 crc kubenswrapper[4749]: W0219 18:49:53.509679 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod006db2e2_fb9c_4a57_9f4c_fa4cb801588c.slice/crio-4b634d8d7f91a0a4b7b7d814267de840654857f5c68f27fe276fb2ad7d68b9b8 WatchSource:0}: Error finding container 4b634d8d7f91a0a4b7b7d814267de840654857f5c68f27fe276fb2ad7d68b9b8: Status 404 returned error can't find the container with id 4b634d8d7f91a0a4b7b7d814267de840654857f5c68f27fe276fb2ad7d68b9b8 Feb 19 18:49:54 crc kubenswrapper[4749]: I0219 18:49:54.447612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" event={"ID":"006db2e2-fb9c-4a57-9f4c-fa4cb801588c","Type":"ContainerStarted","Data":"4b634d8d7f91a0a4b7b7d814267de840654857f5c68f27fe276fb2ad7d68b9b8"} Feb 19 18:49:54 crc kubenswrapper[4749]: I0219 18:49:54.725549 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:49:54 crc kubenswrapper[4749]: I0219 18:49:54.725606 4749 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:49:55 crc kubenswrapper[4749]: I0219 18:49:55.837470 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7fcbd895-gmcp6"] Feb 19 18:49:55 crc kubenswrapper[4749]: I0219 18:49:55.862184 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8469c55d5f-zzv8c"] Feb 19 18:49:55 crc kubenswrapper[4749]: I0219 18:49:55.878215 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:55 crc kubenswrapper[4749]: I0219 18:49:55.898960 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8469c55d5f-zzv8c"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.034420 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-config\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.034521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-dns-svc\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.034552 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p58k\" (UniqueName: 
\"kubernetes.io/projected/453ecb3b-540e-4e07-9c23-451656544d03-kube-api-access-2p58k\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.136329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-config\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.136445 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-dns-svc\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.136475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p58k\" (UniqueName: \"kubernetes.io/projected/453ecb3b-540e-4e07-9c23-451656544d03-kube-api-access-2p58k\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.137429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-config\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.137476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-dns-svc\") pod 
\"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.159308 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c6977f9b9-w4jrv"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.171778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p58k\" (UniqueName: \"kubernetes.io/projected/453ecb3b-540e-4e07-9c23-451656544d03-kube-api-access-2p58k\") pod \"dnsmasq-dns-8469c55d5f-zzv8c\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.190350 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d4bcfd54c-vsmph"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.195622 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.200206 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4bcfd54c-vsmph"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.204874 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.339632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wd6g\" (UniqueName: \"kubernetes.io/projected/a952b799-be38-4df9-82cc-b6b09536f733-kube-api-access-9wd6g\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.339950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-dns-svc\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.340101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-config\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.442009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-config\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.442736 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8469c55d5f-zzv8c"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.443193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-config\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.442201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wd6g\" (UniqueName: \"kubernetes.io/projected/a952b799-be38-4df9-82cc-b6b09536f733-kube-api-access-9wd6g\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.445101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-dns-svc\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.446261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-dns-svc\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.456381 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc7ffd55-bhsd5"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.458035 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.476104 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc7ffd55-bhsd5"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.483867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wd6g\" (UniqueName: \"kubernetes.io/projected/a952b799-be38-4df9-82cc-b6b09536f733-kube-api-access-9wd6g\") pod \"dnsmasq-dns-6d4bcfd54c-vsmph\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.519006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.648388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-dns-svc\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.648461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfhld\" (UniqueName: \"kubernetes.io/projected/ed4e598f-abad-4bd7-997c-c4f54a2e2321-kube-api-access-qfhld\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.648652 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-config\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " 
pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.741163 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4bcfd54c-vsmph"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.749688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-dns-svc\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.749744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfhld\" (UniqueName: \"kubernetes.io/projected/ed4e598f-abad-4bd7-997c-c4f54a2e2321-kube-api-access-qfhld\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.749796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-config\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.750608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-config\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.751008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-dns-svc\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: 
\"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.764002 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8469c55d5f-zzv8c"] Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.770383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfhld\" (UniqueName: \"kubernetes.io/projected/ed4e598f-abad-4bd7-997c-c4f54a2e2321-kube-api-access-qfhld\") pod \"dnsmasq-dns-7fc7ffd55-bhsd5\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:56 crc kubenswrapper[4749]: I0219 18:49:56.818400 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.026281 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.027705 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.035468 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.035703 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.036002 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.036222 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.036371 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-sfhc9" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.037686 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-default-user" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.047744 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.080424 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.159874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/27dfe8e9-686d-4703-b36d-df6b94491b40-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.159916 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.159938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.159956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/27dfe8e9-686d-4703-b36d-df6b94491b40-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.160013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.160048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc 
kubenswrapper[4749]: I0219 18:49:57.160074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.160095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.160117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g7d7\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-kube-api-access-8g7d7\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.160148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.160164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: 
\"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.262403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/27dfe8e9-686d-4703-b36d-df6b94491b40-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.262446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.262466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.263157 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/27dfe8e9-686d-4703-b36d-df6b94491b40-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.263523 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264349 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g7d7\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-kube-api-access-8g7d7\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264514 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.264772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.266674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.266834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/27dfe8e9-686d-4703-b36d-df6b94491b40-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.266884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/27dfe8e9-686d-4703-b36d-df6b94491b40-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.267145 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.282618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g7d7\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-kube-api-access-8g7d7\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.284264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/27dfe8e9-686d-4703-b36d-df6b94491b40-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.284454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.292847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/27dfe8e9-686d-4703-b36d-df6b94491b40-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.303738 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.305110 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.308826 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.309135 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.309167 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.309211 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.309263 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.311501 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-grnfr" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.312287 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.316757 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"27dfe8e9-686d-4703-b36d-df6b94491b40\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.331143 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.369441 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.443110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc7ffd55-bhsd5"] Feb 19 18:49:57 crc kubenswrapper[4749]: W0219 18:49:57.460459 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4e598f_abad_4bd7_997c_c4f54a2e2321.slice/crio-bb471e75ebb9a4ef02adbb213915fdcd2dd50b66f34ef2dc695e234de31dba71 WatchSource:0}: Error finding container bb471e75ebb9a4ef02adbb213915fdcd2dd50b66f34ef2dc695e234de31dba71: Status 404 returned error can't find the container with id bb471e75ebb9a4ef02adbb213915fdcd2dd50b66f34ef2dc695e234de31dba71 Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/042fb593-4898-4085-889e-7ccb375cf969-pod-info\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467763 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467817 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-config-data\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf58l\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-kube-api-access-kf58l\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.467973 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-server-conf\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.468205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/042fb593-4898-4085-889e-7ccb375cf969-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.468238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.478838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" event={"ID":"453ecb3b-540e-4e07-9c23-451656544d03","Type":"ContainerStarted","Data":"60299bbc1ab847e2c15532dbb1f8b28d738f20b46025ac4363e393df02f1c529"} Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.480260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" event={"ID":"ed4e598f-abad-4bd7-997c-c4f54a2e2321","Type":"ContainerStarted","Data":"bb471e75ebb9a4ef02adbb213915fdcd2dd50b66f34ef2dc695e234de31dba71"} Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.485830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" 
event={"ID":"a952b799-be38-4df9-82cc-b6b09536f733","Type":"ContainerStarted","Data":"0d539328743270de1cc13f3b0dc4de9af0e6a8336cfa8a093bf2af0472882e7c"} Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.564309 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.592989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/042fb593-4898-4085-889e-7ccb375cf969-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.593079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.593154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/042fb593-4898-4085-889e-7ccb375cf969-pod-info\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.595171 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-config-data\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " 
pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf58l\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-kube-api-access-kf58l\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.597592 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-server-conf\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.599711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-server-conf\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.600080 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.602090 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.602578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.602655 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.602819 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.603352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-config-data\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.611110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/042fb593-4898-4085-889e-7ccb375cf969-pod-info\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.612778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.613078 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.614008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc 
kubenswrapper[4749]: I0219 18:49:57.625656 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.625803 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.626005 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.626488 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.626854 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hjc7c" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.636115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/042fb593-4898-4085-889e-7ccb375cf969-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.627664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.638444 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf58l\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-kube-api-access-kf58l\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 
18:49:57.684269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.706510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/008062c0-9ccf-4fd2-9b54-63196268da38-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.707331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.707472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/008062c0-9ccf-4fd2-9b54-63196268da38-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.707579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.707732 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.707857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.708004 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.708159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.708265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9l8h\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-kube-api-access-j9l8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.708444 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.708601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/008062c0-9ccf-4fd2-9b54-63196268da38-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/008062c0-9ccf-4fd2-9b54-63196268da38-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810575 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.810591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9l8h\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-kube-api-access-j9l8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.811757 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.813443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.813651 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") device mount path \"/mnt/openstack/pv02\"" 
pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.816848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.817705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.818418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.818389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.822967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.826935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/008062c0-9ccf-4fd2-9b54-63196268da38-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.829002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/008062c0-9ccf-4fd2-9b54-63196268da38-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.829677 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9l8h\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-kube-api-access-j9l8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.868188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.938340 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.957739 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 18:49:57 crc kubenswrapper[4749]: I0219 18:49:57.989275 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.086898 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.088244 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.090500 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.090945 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-6sbdw" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.091121 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.091758 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.095500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.096589 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.233337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 
18:49:59.233399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-kolla-config\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.233426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/affb1316-cbf5-4641-bdd7-186e390b9e7e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.233837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.233924 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/affb1316-cbf5-4641-bdd7-186e390b9e7e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.234006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kktvv\" (UniqueName: \"kubernetes.io/projected/affb1316-cbf5-4641-bdd7-186e390b9e7e-kube-api-access-kktvv\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.234125 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affb1316-cbf5-4641-bdd7-186e390b9e7e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.234156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-config-data-default\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337322 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-kolla-config\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/affb1316-cbf5-4641-bdd7-186e390b9e7e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/affb1316-cbf5-4641-bdd7-186e390b9e7e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kktvv\" (UniqueName: \"kubernetes.io/projected/affb1316-cbf5-4641-bdd7-186e390b9e7e-kube-api-access-kktvv\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affb1316-cbf5-4641-bdd7-186e390b9e7e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.337718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-config-data-default\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.338904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.339283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.339376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/affb1316-cbf5-4641-bdd7-186e390b9e7e-kolla-config\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.341371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/affb1316-cbf5-4641-bdd7-186e390b9e7e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.341493 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.350132 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affb1316-cbf5-4641-bdd7-186e390b9e7e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.353805 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/affb1316-cbf5-4641-bdd7-186e390b9e7e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.357211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kktvv\" (UniqueName: \"kubernetes.io/projected/affb1316-cbf5-4641-bdd7-186e390b9e7e-kube-api-access-kktvv\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.369159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"affb1316-cbf5-4641-bdd7-186e390b9e7e\") " pod="openstack/openstack-galera-0" Feb 19 18:49:59 crc kubenswrapper[4749]: I0219 18:49:59.412406 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.349088 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.361923 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.366663 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.375379 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.383158 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.383248 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2qzth" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.383254 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.466091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.466164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wczb5\" (UniqueName: \"kubernetes.io/projected/e84776ec-57db-4685-84f6-f86655d9f079-kube-api-access-wczb5\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.466317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.466709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.466829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84776ec-57db-4685-84f6-f86655d9f079-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.466965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e84776ec-57db-4685-84f6-f86655d9f079-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.467056 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.467107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e84776ec-57db-4685-84f6-f86655d9f079-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.517001 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.519557 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.521762 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.524344 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.524612 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.524709 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vhvvr" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.568295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.568340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wczb5\" (UniqueName: \"kubernetes.io/projected/e84776ec-57db-4685-84f6-f86655d9f079-kube-api-access-wczb5\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc 
kubenswrapper[4749]: I0219 18:50:00.568370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.568455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.568587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84776ec-57db-4685-84f6-f86655d9f079-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.568665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e84776ec-57db-4685-84f6-f86655d9f079-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.568666 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.569526 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.570607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.570632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.570675 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e84776ec-57db-4685-84f6-f86655d9f079-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.571124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e84776ec-57db-4685-84f6-f86655d9f079-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.572588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e84776ec-57db-4685-84f6-f86655d9f079-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.573474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84776ec-57db-4685-84f6-f86655d9f079-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.575992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e84776ec-57db-4685-84f6-f86655d9f079-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.587134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wczb5\" (UniqueName: \"kubernetes.io/projected/e84776ec-57db-4685-84f6-f86655d9f079-kube-api-access-wczb5\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.592017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e84776ec-57db-4685-84f6-f86655d9f079\") " pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.671662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-config-data\") pod 
\"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.671761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-kolla-config\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.671848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.671879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.671937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7gg5\" (UniqueName: \"kubernetes.io/projected/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-kube-api-access-b7gg5\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.713797 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.773454 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7gg5\" (UniqueName: \"kubernetes.io/projected/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-kube-api-access-b7gg5\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.773580 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-config-data\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.773625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-kolla-config\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.774853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-config-data\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.774893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-kolla-config\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.775283 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.775366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.779172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.783763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.792248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7gg5\" (UniqueName: \"kubernetes.io/projected/7a890ae9-2fb7-4410-b2a5-3374f5555b0f-kube-api-access-b7gg5\") pod \"memcached-0\" (UID: \"7a890ae9-2fb7-4410-b2a5-3374f5555b0f\") " pod="openstack/memcached-0" Feb 19 18:50:00 crc kubenswrapper[4749]: I0219 18:50:00.855369 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 18:50:02 crc kubenswrapper[4749]: I0219 18:50:02.946709 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:50:02 crc kubenswrapper[4749]: I0219 18:50:02.948085 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 18:50:02 crc kubenswrapper[4749]: I0219 18:50:02.955638 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-srz2c" Feb 19 18:50:02 crc kubenswrapper[4749]: I0219 18:50:02.964706 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:50:03 crc kubenswrapper[4749]: I0219 18:50:03.116908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmw4\" (UniqueName: \"kubernetes.io/projected/872559f0-434b-4f87-b6de-c8c56cad33c3-kube-api-access-gwmw4\") pod \"kube-state-metrics-0\" (UID: \"872559f0-434b-4f87-b6de-c8c56cad33c3\") " pod="openstack/kube-state-metrics-0" Feb 19 18:50:03 crc kubenswrapper[4749]: I0219 18:50:03.218530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmw4\" (UniqueName: \"kubernetes.io/projected/872559f0-434b-4f87-b6de-c8c56cad33c3-kube-api-access-gwmw4\") pod \"kube-state-metrics-0\" (UID: \"872559f0-434b-4f87-b6de-c8c56cad33c3\") " pod="openstack/kube-state-metrics-0" Feb 19 18:50:03 crc kubenswrapper[4749]: I0219 18:50:03.260460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwmw4\" (UniqueName: \"kubernetes.io/projected/872559f0-434b-4f87-b6de-c8c56cad33c3-kube-api-access-gwmw4\") pod \"kube-state-metrics-0\" (UID: \"872559f0-434b-4f87-b6de-c8c56cad33c3\") " pod="openstack/kube-state-metrics-0" Feb 19 18:50:03 crc kubenswrapper[4749]: I0219 18:50:03.277437 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.312450 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.314813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.317435 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.317672 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.317809 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.317956 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.318114 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.318489 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-h9bs6" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.318664 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.324631 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.334252 4749 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-config\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21b53583-c33c-47c6-8351-35bd5f08e632-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446241 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446431 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhd9v\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-kube-api-access-fhd9v\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.446690 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-config\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21b53583-c33c-47c6-8351-35bd5f08e632-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 
18:50:04.550278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.550505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhd9v\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-kube-api-access-fhd9v\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.551424 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.551747 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.551893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.553956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21b53583-c33c-47c6-8351-35bd5f08e632-config-out\") pod \"prometheus-metric-storage-0\" 
(UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.553963 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.554008 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7aef4d84e7f064b8dddb5f07903a3617545888f3f79f605754eebcaaed810a22/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.554373 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.554926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-config\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.556705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.556923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.568714 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhd9v\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-kube-api-access-fhd9v\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.578730 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:04 crc kubenswrapper[4749]: I0219 18:50:04.648510 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.026223 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ztkm4"] Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.027675 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.033082 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.033102 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rqctj" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.033281 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.034065 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-sn6j7"] Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.035993 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.046050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ztkm4"] Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.062976 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sn6j7"] Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.175427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-log\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.175490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7232b466-ffe3-4eab-ad4c-bb2ccac65929-scripts\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " 
pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.175568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-log-ovn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.175658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-scripts\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.175742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-etc-ovs\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.175784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7232b466-ffe3-4eab-ad4c-bb2ccac65929-combined-ca-bundle\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.175913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-run\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc 
kubenswrapper[4749]: I0219 18:50:06.175967 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2stjn\" (UniqueName: \"kubernetes.io/projected/7232b466-ffe3-4eab-ad4c-bb2ccac65929-kube-api-access-2stjn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.176013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-lib\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.176054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9p8\" (UniqueName: \"kubernetes.io/projected/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-kube-api-access-ms9p8\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.176162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-run-ovn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.176244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-run\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 
18:50:06.176291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7232b466-ffe3-4eab-ad4c-bb2ccac65929-ovn-controller-tls-certs\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-run\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2stjn\" (UniqueName: \"kubernetes.io/projected/7232b466-ffe3-4eab-ad4c-bb2ccac65929-kube-api-access-2stjn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-lib\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9p8\" (UniqueName: \"kubernetes.io/projected/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-kube-api-access-ms9p8\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278335 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-run-ovn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-run\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7232b466-ffe3-4eab-ad4c-bb2ccac65929-ovn-controller-tls-certs\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-log\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7232b466-ffe3-4eab-ad4c-bb2ccac65929-scripts\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-log-ovn\") 
pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-scripts\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-etc-ovs\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7232b466-ffe3-4eab-ad4c-bb2ccac65929-combined-ca-bundle\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-run\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-lib\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 
18:50:06.278930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-run-ovn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.278966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-run\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.279108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7232b466-ffe3-4eab-ad4c-bb2ccac65929-var-log-ovn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.279142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-etc-ovs\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.279210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-var-log\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.281335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-scripts\") pod 
\"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.281333 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7232b466-ffe3-4eab-ad4c-bb2ccac65929-scripts\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.283516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7232b466-ffe3-4eab-ad4c-bb2ccac65929-combined-ca-bundle\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.288653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7232b466-ffe3-4eab-ad4c-bb2ccac65929-ovn-controller-tls-certs\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.301860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2stjn\" (UniqueName: \"kubernetes.io/projected/7232b466-ffe3-4eab-ad4c-bb2ccac65929-kube-api-access-2stjn\") pod \"ovn-controller-ztkm4\" (UID: \"7232b466-ffe3-4eab-ad4c-bb2ccac65929\") " pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.302602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9p8\" (UniqueName: \"kubernetes.io/projected/091cdb0e-a88c-4731-9a77-6a3c41e0fc1a-kube-api-access-ms9p8\") pod \"ovn-controller-ovs-sn6j7\" (UID: \"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a\") " pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 
18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.350651 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.357635 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.931090 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.932544 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.934811 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rph62" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.934829 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.934991 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.935136 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.938813 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 18:50:06 crc kubenswrapper[4749]: I0219 18:50:06.947458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.091670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2f6471c-3fea-45fc-8702-9022ff831352-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.091740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2f6471c-3fea-45fc-8702-9022ff831352-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.091781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.091867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcnjf\" (UniqueName: \"kubernetes.io/projected/e2f6471c-3fea-45fc-8702-9022ff831352-kube-api-access-lcnjf\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.091901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.091933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 
19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.092015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f6471c-3fea-45fc-8702-9022ff831352-config\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.092211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f6471c-3fea-45fc-8702-9022ff831352-config\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/e2f6471c-3fea-45fc-8702-9022ff831352-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2f6471c-3fea-45fc-8702-9022ff831352-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194560 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.194881 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2f6471c-3fea-45fc-8702-9022ff831352-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.195483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f6471c-3fea-45fc-8702-9022ff831352-config\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " 
pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.195699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2f6471c-3fea-45fc-8702-9022ff831352-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.195772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcnjf\" (UniqueName: \"kubernetes.io/projected/e2f6471c-3fea-45fc-8702-9022ff831352-kube-api-access-lcnjf\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.209839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.211707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.217660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.226314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f6471c-3fea-45fc-8702-9022ff831352-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.227934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcnjf\" (UniqueName: \"kubernetes.io/projected/e2f6471c-3fea-45fc-8702-9022ff831352-kube-api-access-lcnjf\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.241449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e2f6471c-3fea-45fc-8702-9022ff831352\") " pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:07 crc kubenswrapper[4749]: I0219 18:50:07.249497 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:09 crc kubenswrapper[4749]: I0219 18:50:09.961148 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 18:50:09 crc kubenswrapper[4749]: I0219 18:50:09.962591 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:09 crc kubenswrapper[4749]: I0219 18:50:09.966938 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 18:50:09 crc kubenswrapper[4749]: I0219 18:50:09.967117 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 18:50:09 crc kubenswrapper[4749]: I0219 18:50:09.967226 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 18:50:09 crc kubenswrapper[4749]: I0219 18:50:09.967724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-4dmk4" Feb 19 18:50:09 crc kubenswrapper[4749]: I0219 18:50:09.986784 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052527 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052597 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7f450e-9c65-4f47-a259-c6e667660b59-config\") pod \"ovsdbserver-nb-0\" 
(UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052730 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2jj\" (UniqueName: \"kubernetes.io/projected/0c7f450e-9c65-4f47-a259-c6e667660b59-kube-api-access-rc2jj\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c7f450e-9c65-4f47-a259-c6e667660b59-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c7f450e-9c65-4f47-a259-c6e667660b59-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.052847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc 
kubenswrapper[4749]: I0219 18:50:10.154233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7f450e-9c65-4f47-a259-c6e667660b59-config\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2jj\" (UniqueName: \"kubernetes.io/projected/0c7f450e-9c65-4f47-a259-c6e667660b59-kube-api-access-rc2jj\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c7f450e-9c65-4f47-a259-c6e667660b59-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c7f450e-9c65-4f47-a259-c6e667660b59-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.154756 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.155085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c7f450e-9c65-4f47-a259-c6e667660b59-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.155616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c7f450e-9c65-4f47-a259-c6e667660b59-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 
crc kubenswrapper[4749]: I0219 18:50:10.155680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7f450e-9c65-4f47-a259-c6e667660b59-config\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.160781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.161243 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.166730 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f450e-9c65-4f47-a259-c6e667660b59-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.181894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2jj\" (UniqueName: \"kubernetes.io/projected/0c7f450e-9c65-4f47-a259-c6e667660b59-kube-api-access-rc2jj\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.185623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0c7f450e-9c65-4f47-a259-c6e667660b59\") " pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:10 crc kubenswrapper[4749]: I0219 18:50:10.278801 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:12 crc kubenswrapper[4749]: I0219 18:50:12.612081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"27dfe8e9-686d-4703-b36d-df6b94491b40","Type":"ContainerStarted","Data":"7a711742112b795de0301aa068747d7f5687b391f7f9cf3607ca047d3b421911"} Feb 19 18:50:16 crc kubenswrapper[4749]: I0219 18:50:16.362118 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:50:21 crc kubenswrapper[4749]: W0219 18:50:21.526248 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b53583_c33c_47c6_8351_35bd5f08e632.slice/crio-c68e6bead1f9133b15c5d597aecff39cffe2005148663c929951f581574c25e4 WatchSource:0}: Error finding container c68e6bead1f9133b15c5d597aecff39cffe2005148663c929951f581574c25e4: Status 404 returned error can't find the container with id c68e6bead1f9133b15c5d597aecff39cffe2005148663c929951f581574c25e4 Feb 19 18:50:21 crc kubenswrapper[4749]: I0219 18:50:21.678774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerStarted","Data":"c68e6bead1f9133b15c5d597aecff39cffe2005148663c929951f581574c25e4"} Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.188868 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 
18:50:22.188930 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.189107 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgjkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil
,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-c6977f9b9-w4jrv_openstack(2a888ed9-b69c-4846-ab38-c29719ab3ab1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.190299 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" podUID="2a888ed9-b69c-4846-ab38-c29719ab3ab1" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.209059 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.209109 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.209201 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n77hb9hddhdfhf5h5cch698h578h5f8h675h5c5hdch97h5bch59bh5b6h55h5bch556hb5h599h8dhc8h667h59ch659h578hcfh5c7h9dh645h554q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qfhld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7fc7ffd55-bhsd5_openstack(ed4e598f-abad-4bd7-997c-c4f54a2e2321): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.213086 4749 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.277175 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.277235 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.277342 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n658h5c5h88h68dhb6h57dhd4h697hb8h8fh74hb7h54fh54dh548h7h55dhb8h9fh55dh688h5bbh5d5h675h669hb7h67hbbhffh668h5c7hc5q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wd6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d4bcfd54c-vsmph_openstack(a952b799-be38-4df9-82cc-b6b09536f733): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.278631 4749 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" podUID="a952b799-be38-4df9-82cc-b6b09536f733" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.312346 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.312409 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.312527 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n684h65fh56h6fh87h85h57h76h5b7h94hffh649hfbh8ch5bch56fh5c5hbh86hf9h99h5dch95h66hd5h555h566h646h546h79h9dh55dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2p58k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8469c55d5f-zzv8c_openstack(453ecb3b-540e-4e07-9c23-451656544d03): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.317295 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" podUID="453ecb3b-540e-4e07-9c23-451656544d03" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.383324 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.383390 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.383514 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4k8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6b7fcbd895-gmcp6_openstack(006db2e2-fb9c-4a57-9f4c-fa4cb801588c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.384991 4749 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" podUID="006db2e2-fb9c-4a57-9f4c-fa4cb801588c" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.688498 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest\\\"\"" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" Feb 19 18:50:22 crc kubenswrapper[4749]: E0219 18:50:22.690398 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/podified-master-centos10/openstack-neutron-server:watcher_latest\\\"\"" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" podUID="a952b799-be38-4df9-82cc-b6b09536f733" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.174898 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.180897 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.220236 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.317900 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.384054 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.466765 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 18:50:23 crc 
kubenswrapper[4749]: I0219 18:50:23.487844 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ztkm4"] Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.497218 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.502334 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:50:23 crc kubenswrapper[4749]: W0219 18:50:23.595183 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008062c0_9ccf_4fd2_9b54_63196268da38.slice/crio-0ab55f0f2b448d6438081f2a89273710872f8800d82fadb48c5541554c55c361 WatchSource:0}: Error finding container 0ab55f0f2b448d6438081f2a89273710872f8800d82fadb48c5541554c55c361: Status 404 returned error can't find the container with id 0ab55f0f2b448d6438081f2a89273710872f8800d82fadb48c5541554c55c361 Feb 19 18:50:23 crc kubenswrapper[4749]: W0219 18:50:23.598340 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7232b466_ffe3_4eab_ad4c_bb2ccac65929.slice/crio-98b17d16e777ea751cb16dab15dfa861c659e942bb363be572128e1e11b7f19c WatchSource:0}: Error finding container 98b17d16e777ea751cb16dab15dfa861c659e942bb363be572128e1e11b7f19c: Status 404 returned error can't find the container with id 98b17d16e777ea751cb16dab15dfa861c659e942bb363be572128e1e11b7f19c Feb 19 18:50:23 crc kubenswrapper[4749]: W0219 18:50:23.610715 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2f6471c_3fea_45fc_8702_9022ff831352.slice/crio-3d7aa260b5eb64a98dbafc529fd2fb08a9463f9673505437f09154352fb6fa87 WatchSource:0}: Error finding container 3d7aa260b5eb64a98dbafc529fd2fb08a9463f9673505437f09154352fb6fa87: Status 404 returned error can't find the container with 
id 3d7aa260b5eb64a98dbafc529fd2fb08a9463f9673505437f09154352fb6fa87 Feb 19 18:50:23 crc kubenswrapper[4749]: W0219 18:50:23.612073 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c7f450e_9c65_4f47_a259_c6e667660b59.slice/crio-f4f338a564fa427c748a0e0041868221782caa6eac81c54a4a076bbff11abd03 WatchSource:0}: Error finding container f4f338a564fa427c748a0e0041868221782caa6eac81c54a4a076bbff11abd03: Status 404 returned error can't find the container with id f4f338a564fa427c748a0e0041868221782caa6eac81c54a4a076bbff11abd03 Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.696965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"042fb593-4898-4085-889e-7ccb375cf969","Type":"ContainerStarted","Data":"598b1485a47e9995ae401e652012917e716c7a7c520c41e6fc138c367cf7e054"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.699215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e2f6471c-3fea-45fc-8702-9022ff831352","Type":"ContainerStarted","Data":"3d7aa260b5eb64a98dbafc529fd2fb08a9463f9673505437f09154352fb6fa87"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.708019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"affb1316-cbf5-4641-bdd7-186e390b9e7e","Type":"ContainerStarted","Data":"f241abbb1815fcfa55bd0d3b1876afa2a67059ce8489e3c91288d4a2da1f4649"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.709368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" event={"ID":"2a888ed9-b69c-4846-ab38-c29719ab3ab1","Type":"ContainerDied","Data":"73f622904f71c001fe7c2d27a36dc896c6ee309720d2e6960b2a07909b0f7fe7"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.709409 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="73f622904f71c001fe7c2d27a36dc896c6ee309720d2e6960b2a07909b0f7fe7" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.710972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4" event={"ID":"7232b466-ffe3-4eab-ad4c-bb2ccac65929","Type":"ContainerStarted","Data":"98b17d16e777ea751cb16dab15dfa861c659e942bb363be572128e1e11b7f19c"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.723680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e84776ec-57db-4685-84f6-f86655d9f079","Type":"ContainerStarted","Data":"e6026bccb2dad8edd81f96cfc1f80d1bb533a9b377a78017f22a313e7ede2908"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.728489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0c7f450e-9c65-4f47-a259-c6e667660b59","Type":"ContainerStarted","Data":"f4f338a564fa427c748a0e0041868221782caa6eac81c54a4a076bbff11abd03"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.732338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"008062c0-9ccf-4fd2-9b54-63196268da38","Type":"ContainerStarted","Data":"0ab55f0f2b448d6438081f2a89273710872f8800d82fadb48c5541554c55c361"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.743733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7a890ae9-2fb7-4410-b2a5-3374f5555b0f","Type":"ContainerStarted","Data":"1d23302bb4ed8ad1a5cdd5e0180f7c75a0a182524b9dc0035d92ed3aa964556a"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.747163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"27dfe8e9-686d-4703-b36d-df6b94491b40","Type":"ContainerStarted","Data":"c0e69d20b50ef864ddb00ad8f03e6c1432650658590c1887e3429fa7d1f8449f"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.749130 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" event={"ID":"006db2e2-fb9c-4a57-9f4c-fa4cb801588c","Type":"ContainerDied","Data":"4b634d8d7f91a0a4b7b7d814267de840654857f5c68f27fe276fb2ad7d68b9b8"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.749168 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b634d8d7f91a0a4b7b7d814267de840654857f5c68f27fe276fb2ad7d68b9b8" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.761396 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" event={"ID":"453ecb3b-540e-4e07-9c23-451656544d03","Type":"ContainerDied","Data":"60299bbc1ab847e2c15532dbb1f8b28d738f20b46025ac4363e393df02f1c529"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.761435 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60299bbc1ab847e2c15532dbb1f8b28d738f20b46025ac4363e393df02f1c529" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.762703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"872559f0-434b-4f87-b6de-c8c56cad33c3","Type":"ContainerStarted","Data":"abf5950bdbfbf9f265d9023b1fdf57043643ce879cd4128f70cb8637917306e2"} Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.772841 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.818902 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.832945 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931064 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-dns-svc\") pod \"453ecb3b-540e-4e07-9c23-451656544d03\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-config\") pod \"453ecb3b-540e-4e07-9c23-451656544d03\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-config\") pod \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931263 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p58k\" (UniqueName: \"kubernetes.io/projected/453ecb3b-540e-4e07-9c23-451656544d03-kube-api-access-2p58k\") pod \"453ecb3b-540e-4e07-9c23-451656544d03\" (UID: \"453ecb3b-540e-4e07-9c23-451656544d03\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4k8n\" (UniqueName: \"kubernetes.io/projected/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-kube-api-access-v4k8n\") pod \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgjkt\" 
(UniqueName: \"kubernetes.io/projected/2a888ed9-b69c-4846-ab38-c29719ab3ab1-kube-api-access-dgjkt\") pod \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931438 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-dns-svc\") pod \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\" (UID: \"006db2e2-fb9c-4a57-9f4c-fa4cb801588c\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.931469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a888ed9-b69c-4846-ab38-c29719ab3ab1-config\") pod \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\" (UID: \"2a888ed9-b69c-4846-ab38-c29719ab3ab1\") " Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.932063 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-config" (OuterVolumeSpecName: "config") pod "453ecb3b-540e-4e07-9c23-451656544d03" (UID: "453ecb3b-540e-4e07-9c23-451656544d03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.932551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "453ecb3b-540e-4e07-9c23-451656544d03" (UID: "453ecb3b-540e-4e07-9c23-451656544d03"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.932616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-config" (OuterVolumeSpecName: "config") pod "006db2e2-fb9c-4a57-9f4c-fa4cb801588c" (UID: "006db2e2-fb9c-4a57-9f4c-fa4cb801588c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.933087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "006db2e2-fb9c-4a57-9f4c-fa4cb801588c" (UID: "006db2e2-fb9c-4a57-9f4c-fa4cb801588c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.933366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a888ed9-b69c-4846-ab38-c29719ab3ab1-config" (OuterVolumeSpecName: "config") pod "2a888ed9-b69c-4846-ab38-c29719ab3ab1" (UID: "2a888ed9-b69c-4846-ab38-c29719ab3ab1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.937985 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-kube-api-access-v4k8n" (OuterVolumeSpecName: "kube-api-access-v4k8n") pod "006db2e2-fb9c-4a57-9f4c-fa4cb801588c" (UID: "006db2e2-fb9c-4a57-9f4c-fa4cb801588c"). InnerVolumeSpecName "kube-api-access-v4k8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:23 crc kubenswrapper[4749]: I0219 18:50:23.940314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453ecb3b-540e-4e07-9c23-451656544d03-kube-api-access-2p58k" (OuterVolumeSpecName: "kube-api-access-2p58k") pod "453ecb3b-540e-4e07-9c23-451656544d03" (UID: "453ecb3b-540e-4e07-9c23-451656544d03"). InnerVolumeSpecName "kube-api-access-2p58k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.034016 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.034079 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a888ed9-b69c-4846-ab38-c29719ab3ab1-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.034090 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.034100 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/453ecb3b-540e-4e07-9c23-451656544d03-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.034179 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.034190 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p58k\" (UniqueName: 
\"kubernetes.io/projected/453ecb3b-540e-4e07-9c23-451656544d03-kube-api-access-2p58k\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.034200 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4k8n\" (UniqueName: \"kubernetes.io/projected/006db2e2-fb9c-4a57-9f4c-fa4cb801588c-kube-api-access-v4k8n\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.036263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a888ed9-b69c-4846-ab38-c29719ab3ab1-kube-api-access-dgjkt" (OuterVolumeSpecName: "kube-api-access-dgjkt") pod "2a888ed9-b69c-4846-ab38-c29719ab3ab1" (UID: "2a888ed9-b69c-4846-ab38-c29719ab3ab1"). InnerVolumeSpecName "kube-api-access-dgjkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.064807 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sn6j7"] Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.135449 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgjkt\" (UniqueName: \"kubernetes.io/projected/2a888ed9-b69c-4846-ab38-c29719ab3ab1-kube-api-access-dgjkt\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.725886 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.725936 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.770868 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c6977f9b9-w4jrv" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.772321 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8469c55d5f-zzv8c" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.772853 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7fcbd895-gmcp6" Feb 19 18:50:24 crc kubenswrapper[4749]: I0219 18:50:24.780905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sn6j7" event={"ID":"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a","Type":"ContainerStarted","Data":"738af88ededf71e31a1850d1f613e579e0422aba26e86126d9a8fc8d472a5836"} Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 18:50:25.042153 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8469c55d5f-zzv8c"] Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 18:50:25.054071 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8469c55d5f-zzv8c"] Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 18:50:25.070508 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c6977f9b9-w4jrv"] Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 18:50:25.077967 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c6977f9b9-w4jrv"] Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 18:50:25.095084 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7fcbd895-gmcp6"] Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 18:50:25.100280 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7fcbd895-gmcp6"] Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 
18:50:25.778591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"008062c0-9ccf-4fd2-9b54-63196268da38","Type":"ContainerStarted","Data":"d424cd182682a0c438d89dbb09758c27fa545c0f0dd6fcc5b71037d7ded4f1c4"} Feb 19 18:50:25 crc kubenswrapper[4749]: I0219 18:50:25.781643 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"042fb593-4898-4085-889e-7ccb375cf969","Type":"ContainerStarted","Data":"fd26a9103b0a88682847d38bd2a0a2ca1f91ec3eea2089769c23970dca2fdfd3"} Feb 19 18:50:26 crc kubenswrapper[4749]: I0219 18:50:26.709552 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006db2e2-fb9c-4a57-9f4c-fa4cb801588c" path="/var/lib/kubelet/pods/006db2e2-fb9c-4a57-9f4c-fa4cb801588c/volumes" Feb 19 18:50:26 crc kubenswrapper[4749]: I0219 18:50:26.710230 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a888ed9-b69c-4846-ab38-c29719ab3ab1" path="/var/lib/kubelet/pods/2a888ed9-b69c-4846-ab38-c29719ab3ab1/volumes" Feb 19 18:50:26 crc kubenswrapper[4749]: I0219 18:50:26.710593 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453ecb3b-540e-4e07-9c23-451656544d03" path="/var/lib/kubelet/pods/453ecb3b-540e-4e07-9c23-451656544d03/volumes" Feb 19 18:50:26 crc kubenswrapper[4749]: I0219 18:50:26.809498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerStarted","Data":"8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e"} Feb 19 18:50:30 crc kubenswrapper[4749]: I0219 18:50:30.842676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7a890ae9-2fb7-4410-b2a5-3374f5555b0f","Type":"ContainerStarted","Data":"57d10eb7b49624945f10d18d1282466c4762468a49fd1606b080e17cb7cbad11"} Feb 19 18:50:30 crc kubenswrapper[4749]: I0219 18:50:30.844314 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 18:50:30 crc kubenswrapper[4749]: I0219 18:50:30.862373 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=25.488656025 podStartE2EDuration="30.862352698s" podCreationTimestamp="2026-02-19 18:50:00 +0000 UTC" firstStartedPulling="2026-02-19 18:50:23.681210256 +0000 UTC m=+997.642430210" lastFinishedPulling="2026-02-19 18:50:29.054906939 +0000 UTC m=+1003.016126883" observedRunningTime="2026-02-19 18:50:30.860977455 +0000 UTC m=+1004.822197439" watchObservedRunningTime="2026-02-19 18:50:30.862352698 +0000 UTC m=+1004.823572652" Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.852274 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"872559f0-434b-4f87-b6de-c8c56cad33c3","Type":"ContainerStarted","Data":"4137f06e3b49a547062d31e61ecdaae382c94d659728ddfdd4ead2d540194717"} Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.852594 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.854930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e2f6471c-3fea-45fc-8702-9022ff831352","Type":"ContainerStarted","Data":"c633f279b6286114f44effdf014927189839bb23fcc9024e0d4d87d61c94546c"} Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.857005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0c7f450e-9c65-4f47-a259-c6e667660b59","Type":"ContainerStarted","Data":"5efa70c8fc7034d7d14f1cea30f9dcfdd673295743bc9172dd6207c4f55f6d5a"} Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.858742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"affb1316-cbf5-4641-bdd7-186e390b9e7e","Type":"ContainerStarted","Data":"95117842f82f4291afd5168ceb126a41b70074eab7b061ad5d75b1798561eef2"} Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.860735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4" event={"ID":"7232b466-ffe3-4eab-ad4c-bb2ccac65929","Type":"ContainerStarted","Data":"de7c5ad122ec138177a634866a7333ed4d53aef69819e90c4d3c802a195ca019"} Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.861608 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ztkm4" Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.867360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e84776ec-57db-4685-84f6-f86655d9f079","Type":"ContainerStarted","Data":"7d23666248d096d152cba583714b2eb367184b5f2b321bfc383c8f719e2fd902"} Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.871777 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.691212161 podStartE2EDuration="29.871760149s" podCreationTimestamp="2026-02-19 18:50:02 +0000 UTC" firstStartedPulling="2026-02-19 18:50:23.605559435 +0000 UTC m=+997.566779389" lastFinishedPulling="2026-02-19 18:50:30.786107423 +0000 UTC m=+1004.747327377" observedRunningTime="2026-02-19 18:50:31.868705155 +0000 UTC m=+1005.829925129" watchObservedRunningTime="2026-02-19 18:50:31.871760149 +0000 UTC m=+1005.832980103" Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.879648 4749 generic.go:334] "Generic (PLEG): container finished" podID="091cdb0e-a88c-4731-9a77-6a3c41e0fc1a" containerID="d9aaf7d992185fc278a78ca2c56e3486f7a908a26db31127ffa2d771d2e55e76" exitCode=0 Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.880456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sn6j7" 
event={"ID":"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a","Type":"ContainerDied","Data":"d9aaf7d992185fc278a78ca2c56e3486f7a908a26db31127ffa2d771d2e55e76"} Feb 19 18:50:31 crc kubenswrapper[4749]: I0219 18:50:31.914562 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ztkm4" podStartSLOduration=20.026007771 podStartE2EDuration="25.914540361s" podCreationTimestamp="2026-02-19 18:50:06 +0000 UTC" firstStartedPulling="2026-02-19 18:50:23.601551328 +0000 UTC m=+997.562771282" lastFinishedPulling="2026-02-19 18:50:29.490083908 +0000 UTC m=+1003.451303872" observedRunningTime="2026-02-19 18:50:31.886749044 +0000 UTC m=+1005.847969008" watchObservedRunningTime="2026-02-19 18:50:31.914540361 +0000 UTC m=+1005.875760315" Feb 19 18:50:32 crc kubenswrapper[4749]: I0219 18:50:32.890457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sn6j7" event={"ID":"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a","Type":"ContainerStarted","Data":"263c1f184109bf7f93c9f4b1e1981b8a1c8e160dac5b0dbda4782df4f4c7a5e7"} Feb 19 18:50:32 crc kubenswrapper[4749]: I0219 18:50:32.893648 4749 generic.go:334] "Generic (PLEG): container finished" podID="21b53583-c33c-47c6-8351-35bd5f08e632" containerID="8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e" exitCode=0 Feb 19 18:50:32 crc kubenswrapper[4749]: I0219 18:50:32.893750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerDied","Data":"8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e"} Feb 19 18:50:33 crc kubenswrapper[4749]: I0219 18:50:33.902068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sn6j7" event={"ID":"091cdb0e-a88c-4731-9a77-6a3c41e0fc1a","Type":"ContainerStarted","Data":"7ce8257166391115e52718b4b73f90b6810cd148452586d0e06f509ccc275732"} Feb 19 18:50:33 crc kubenswrapper[4749]: 
I0219 18:50:33.902680 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:33 crc kubenswrapper[4749]: I0219 18:50:33.902733 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:50:33 crc kubenswrapper[4749]: I0219 18:50:33.907628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0c7f450e-9c65-4f47-a259-c6e667660b59","Type":"ContainerStarted","Data":"3c2bc4b0ef17ec1377b7b08f093eebe39fec05c3afca724bea079228c1c6af25"} Feb 19 18:50:33 crc kubenswrapper[4749]: I0219 18:50:33.911119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e2f6471c-3fea-45fc-8702-9022ff831352","Type":"ContainerStarted","Data":"72bcbdfbbee907d020f0ff54b2f096aa215e2d0d802943d638608865f3b267cf"} Feb 19 18:50:33 crc kubenswrapper[4749]: I0219 18:50:33.935660 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-sn6j7" podStartSLOduration=22.65710819 podStartE2EDuration="27.935640468s" podCreationTimestamp="2026-02-19 18:50:06 +0000 UTC" firstStartedPulling="2026-02-19 18:50:24.143134205 +0000 UTC m=+998.104354159" lastFinishedPulling="2026-02-19 18:50:29.421666483 +0000 UTC m=+1003.382886437" observedRunningTime="2026-02-19 18:50:33.930843632 +0000 UTC m=+1007.892063596" watchObservedRunningTime="2026-02-19 18:50:33.935640468 +0000 UTC m=+1007.896860422" Feb 19 18:50:33 crc kubenswrapper[4749]: I0219 18:50:33.956579 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.615451118 podStartE2EDuration="28.956559987s" podCreationTimestamp="2026-02-19 18:50:05 +0000 UTC" firstStartedPulling="2026-02-19 18:50:23.614711748 +0000 UTC m=+997.575931702" lastFinishedPulling="2026-02-19 18:50:32.955820627 +0000 UTC m=+1006.917040571" 
observedRunningTime="2026-02-19 18:50:33.954120907 +0000 UTC m=+1007.915340861" watchObservedRunningTime="2026-02-19 18:50:33.956559987 +0000 UTC m=+1007.917779951" Feb 19 18:50:33 crc kubenswrapper[4749]: I0219 18:50:33.979831 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.632997845 podStartE2EDuration="25.979813413s" podCreationTimestamp="2026-02-19 18:50:08 +0000 UTC" firstStartedPulling="2026-02-19 18:50:23.614690528 +0000 UTC m=+997.575910482" lastFinishedPulling="2026-02-19 18:50:32.961506096 +0000 UTC m=+1006.922726050" observedRunningTime="2026-02-19 18:50:33.972858154 +0000 UTC m=+1007.934078138" watchObservedRunningTime="2026-02-19 18:50:33.979813413 +0000 UTC m=+1007.941033367" Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.249874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.279358 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.288453 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.330937 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.916938 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerID="5b1067ca11904662e8871d06d2d7eb4f225d389757b52fb232ac361677594b88" exitCode=0 Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.918003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" 
event={"ID":"ed4e598f-abad-4bd7-997c-c4f54a2e2321","Type":"ContainerDied","Data":"5b1067ca11904662e8871d06d2d7eb4f225d389757b52fb232ac361677594b88"} Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.918792 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:34 crc kubenswrapper[4749]: I0219 18:50:34.919194 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:35 crc kubenswrapper[4749]: I0219 18:50:35.856857 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 18:50:35 crc kubenswrapper[4749]: I0219 18:50:35.926934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" event={"ID":"ed4e598f-abad-4bd7-997c-c4f54a2e2321","Type":"ContainerStarted","Data":"ea8c40de4fe3e4c9ab0fc186943e8fb38b46bdcd4faac6edefdf7360d53cfea3"} Feb 19 18:50:35 crc kubenswrapper[4749]: I0219 18:50:35.927292 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:50:36 crc kubenswrapper[4749]: I0219 18:50:36.974587 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 18:50:36 crc kubenswrapper[4749]: I0219 18:50:36.977180 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.010590 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" podStartSLOduration=4.70728167 podStartE2EDuration="41.010570847s" podCreationTimestamp="2026-02-19 18:49:56 +0000 UTC" firstStartedPulling="2026-02-19 18:49:57.465654155 +0000 UTC m=+971.426874109" lastFinishedPulling="2026-02-19 18:50:33.768943302 +0000 UTC m=+1007.730163286" observedRunningTime="2026-02-19 18:50:35.949875889 +0000 UTC 
m=+1009.911095843" watchObservedRunningTime="2026-02-19 18:50:37.010570847 +0000 UTC m=+1010.971790801" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.230185 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4bcfd54c-vsmph"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.274640 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c7f86b5bc-tks59"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.275929 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.278245 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.304018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7f86b5bc-tks59"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.351988 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-sbvrv"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.352929 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.359736 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59199734-1adc-46b0-9208-75331e4b868c-combined-ca-bundle\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-config\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59199734-1adc-46b0-9208-75331e4b868c-config\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-dns-svc\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59199734-1adc-46b0-9208-75331e4b868c-ovs-rundir\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z4sd\" (UniqueName: \"kubernetes.io/projected/59199734-1adc-46b0-9208-75331e4b868c-kube-api-access-9z4sd\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59199734-1adc-46b0-9208-75331e4b868c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46w9l\" (UniqueName: \"kubernetes.io/projected/1791b0e1-f604-483e-b4f3-79efb0779828-kube-api-access-46w9l\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.369385 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59199734-1adc-46b0-9208-75331e4b868c-ovn-rundir\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.382220 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-sbvrv"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59199734-1adc-46b0-9208-75331e4b868c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46w9l\" (UniqueName: \"kubernetes.io/projected/1791b0e1-f604-483e-b4f3-79efb0779828-kube-api-access-46w9l\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59199734-1adc-46b0-9208-75331e4b868c-ovn-rundir\") pod \"ovn-controller-metrics-sbvrv\" (UID: 
\"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59199734-1adc-46b0-9208-75331e4b868c-combined-ca-bundle\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-config\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59199734-1adc-46b0-9208-75331e4b868c-config\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.472980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-dns-svc\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.473001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59199734-1adc-46b0-9208-75331e4b868c-ovs-rundir\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 
18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.473066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z4sd\" (UniqueName: \"kubernetes.io/projected/59199734-1adc-46b0-9208-75331e4b868c-kube-api-access-9z4sd\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.475264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.475829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59199734-1adc-46b0-9208-75331e4b868c-config\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.476639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59199734-1adc-46b0-9208-75331e4b868c-ovn-rundir\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.476734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59199734-1adc-46b0-9208-75331e4b868c-ovs-rundir\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.476941 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-dns-svc\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.477112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-config\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.484115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59199734-1adc-46b0-9208-75331e4b868c-combined-ca-bundle\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.487843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59199734-1adc-46b0-9208-75331e4b868c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.500786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z4sd\" (UniqueName: \"kubernetes.io/projected/59199734-1adc-46b0-9208-75331e4b868c-kube-api-access-9z4sd\") pod \"ovn-controller-metrics-sbvrv\" (UID: \"59199734-1adc-46b0-9208-75331e4b868c\") " pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.511277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46w9l\" (UniqueName: 
\"kubernetes.io/projected/1791b0e1-f604-483e-b4f3-79efb0779828-kube-api-access-46w9l\") pod \"dnsmasq-dns-6c7f86b5bc-tks59\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") " pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.530043 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc7ffd55-bhsd5"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.543169 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.544501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.559335 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.559349 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.559392 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6pj9t" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.565883 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.566812 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.574482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.574558 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.574588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2231660b-7776-4cf8-a793-7d592dd23ecf-scripts\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.574610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2231660b-7776-4cf8-a793-7d592dd23ecf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.574637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grtx4\" (UniqueName: \"kubernetes.io/projected/2231660b-7776-4cf8-a793-7d592dd23ecf-kube-api-access-grtx4\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.574683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.574698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2231660b-7776-4cf8-a793-7d592dd23ecf-config\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.590822 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-559c64c447-q9fg6"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.609393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.611907 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.613236 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.635044 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559c64c447-q9fg6"] Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grtx4\" (UniqueName: \"kubernetes.io/projected/2231660b-7776-4cf8-a793-7d592dd23ecf-kube-api-access-grtx4\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-dns-svc\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-nb\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677207 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2231660b-7776-4cf8-a793-7d592dd23ecf-config\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677311 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whk2\" (UniqueName: \"kubernetes.io/projected/507b0ab0-b98c-490e-95d4-c9b7e88d969d-kube-api-access-5whk2\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-sb\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677350 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2231660b-7776-4cf8-a793-7d592dd23ecf-scripts\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2231660b-7776-4cf8-a793-7d592dd23ecf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.677411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-config\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.678940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2231660b-7776-4cf8-a793-7d592dd23ecf-config\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 
18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.680322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2231660b-7776-4cf8-a793-7d592dd23ecf-scripts\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.681962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.681997 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-sbvrv" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.682317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2231660b-7776-4cf8-a793-7d592dd23ecf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.685402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.701397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grtx4\" (UniqueName: \"kubernetes.io/projected/2231660b-7776-4cf8-a793-7d592dd23ecf-kube-api-access-grtx4\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.701455 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2231660b-7776-4cf8-a793-7d592dd23ecf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2231660b-7776-4cf8-a793-7d592dd23ecf\") " pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.780013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whk2\" (UniqueName: \"kubernetes.io/projected/507b0ab0-b98c-490e-95d4-c9b7e88d969d-kube-api-access-5whk2\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.780078 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-sb\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.780127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-config\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.780155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-dns-svc\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.780182 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-nb\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.781053 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-sb\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.781164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-dns-svc\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.781261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-nb\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.781348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-config\") pod \"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.799593 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whk2\" (UniqueName: \"kubernetes.io/projected/507b0ab0-b98c-490e-95d4-c9b7e88d969d-kube-api-access-5whk2\") pod 
\"dnsmasq-dns-559c64c447-q9fg6\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.888715 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.936501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:37 crc kubenswrapper[4749]: I0219 18:50:37.942237 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerName="dnsmasq-dns" containerID="cri-o://ea8c40de4fe3e4c9ab0fc186943e8fb38b46bdcd4faac6edefdf7360d53cfea3" gracePeriod=10 Feb 19 18:50:38 crc kubenswrapper[4749]: I0219 18:50:38.952478 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerID="ea8c40de4fe3e4c9ab0fc186943e8fb38b46bdcd4faac6edefdf7360d53cfea3" exitCode=0 Feb 19 18:50:38 crc kubenswrapper[4749]: I0219 18:50:38.952522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" event={"ID":"ed4e598f-abad-4bd7-997c-c4f54a2e2321","Type":"ContainerDied","Data":"ea8c40de4fe3e4c9ab0fc186943e8fb38b46bdcd4faac6edefdf7360d53cfea3"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.370383 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.516432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-dns-svc\") pod \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.516477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-config\") pod \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.516654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfhld\" (UniqueName: \"kubernetes.io/projected/ed4e598f-abad-4bd7-997c-c4f54a2e2321-kube-api-access-qfhld\") pod \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\" (UID: \"ed4e598f-abad-4bd7-997c-c4f54a2e2321\") " Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.523113 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4e598f-abad-4bd7-997c-c4f54a2e2321-kube-api-access-qfhld" (OuterVolumeSpecName: "kube-api-access-qfhld") pod "ed4e598f-abad-4bd7-997c-c4f54a2e2321" (UID: "ed4e598f-abad-4bd7-997c-c4f54a2e2321"). InnerVolumeSpecName "kube-api-access-qfhld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.546774 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.554901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed4e598f-abad-4bd7-997c-c4f54a2e2321" (UID: "ed4e598f-abad-4bd7-997c-c4f54a2e2321"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.555468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-config" (OuterVolumeSpecName: "config") pod "ed4e598f-abad-4bd7-997c-c4f54a2e2321" (UID: "ed4e598f-abad-4bd7-997c-c4f54a2e2321"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:39 crc kubenswrapper[4749]: W0219 18:50:39.561226 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2231660b_7776_4cf8_a793_7d592dd23ecf.slice/crio-d4ac1d789b9eb1743bf0e62569a2d68efd72f61735be424cb4528af8c341c424 WatchSource:0}: Error finding container d4ac1d789b9eb1743bf0e62569a2d68efd72f61735be424cb4528af8c341c424: Status 404 returned error can't find the container with id d4ac1d789b9eb1743bf0e62569a2d68efd72f61735be424cb4528af8c341c424 Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.618128 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfhld\" (UniqueName: \"kubernetes.io/projected/ed4e598f-abad-4bd7-997c-c4f54a2e2321-kube-api-access-qfhld\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.618157 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.618165 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4e598f-abad-4bd7-997c-c4f54a2e2321-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.639062 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-sbvrv"] Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.645422 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7f86b5bc-tks59"] Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.651581 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559c64c447-q9fg6"] Feb 19 18:50:39 crc kubenswrapper[4749]: W0219 18:50:39.656204 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59199734_1adc_46b0_9208_75331e4b868c.slice/crio-102d838a171db586fb65f8b49b3e1c12d1e81d06e70bfe5179eaf754e4d30654 WatchSource:0}: Error finding container 102d838a171db586fb65f8b49b3e1c12d1e81d06e70bfe5179eaf754e4d30654: Status 404 returned error can't find the container with id 102d838a171db586fb65f8b49b3e1c12d1e81d06e70bfe5179eaf754e4d30654 Feb 19 18:50:39 crc kubenswrapper[4749]: W0219 18:50:39.661762 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1791b0e1_f604_483e_b4f3_79efb0779828.slice/crio-7ca5d848a8edcf399f69e4496df88f257a9acf317ee172a5971446acc738d3f9 WatchSource:0}: Error finding container 7ca5d848a8edcf399f69e4496df88f257a9acf317ee172a5971446acc738d3f9: Status 404 returned error can't find the container with id 7ca5d848a8edcf399f69e4496df88f257a9acf317ee172a5971446acc738d3f9 Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 
18:50:39.960214 4749 generic.go:334] "Generic (PLEG): container finished" podID="1791b0e1-f604-483e-b4f3-79efb0779828" containerID="aae4c96dad7ebe1a2d5a2cd30fbb4333009f03fa387f0a0d855c07094dddbd3f" exitCode=0 Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.960593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" event={"ID":"1791b0e1-f604-483e-b4f3-79efb0779828","Type":"ContainerDied","Data":"aae4c96dad7ebe1a2d5a2cd30fbb4333009f03fa387f0a0d855c07094dddbd3f"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.960620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" event={"ID":"1791b0e1-f604-483e-b4f3-79efb0779828","Type":"ContainerStarted","Data":"7ca5d848a8edcf399f69e4496df88f257a9acf317ee172a5971446acc738d3f9"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.963141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerStarted","Data":"31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.983715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2231660b-7776-4cf8-a793-7d592dd23ecf","Type":"ContainerStarted","Data":"d4ac1d789b9eb1743bf0e62569a2d68efd72f61735be424cb4528af8c341c424"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.986866 4749 generic.go:334] "Generic (PLEG): container finished" podID="a952b799-be38-4df9-82cc-b6b09536f733" containerID="c48671f7a6e7df2e763554de45cf319141b99a66ca0716f6ad587d17e523bab5" exitCode=0 Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.986949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" 
event={"ID":"a952b799-be38-4df9-82cc-b6b09536f733","Type":"ContainerDied","Data":"c48671f7a6e7df2e763554de45cf319141b99a66ca0716f6ad587d17e523bab5"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.991617 4749 generic.go:334] "Generic (PLEG): container finished" podID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerID="d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec" exitCode=0 Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.991903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" event={"ID":"507b0ab0-b98c-490e-95d4-c9b7e88d969d","Type":"ContainerDied","Data":"d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.991946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" event={"ID":"507b0ab0-b98c-490e-95d4-c9b7e88d969d","Type":"ContainerStarted","Data":"7c7b695bba76fa6c318d54c4cd95a7239e967590eb1a4ecd71d81904cfecd71b"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.993282 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.993281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7ffd55-bhsd5" event={"ID":"ed4e598f-abad-4bd7-997c-c4f54a2e2321","Type":"ContainerDied","Data":"bb471e75ebb9a4ef02adbb213915fdcd2dd50b66f34ef2dc695e234de31dba71"} Feb 19 18:50:39 crc kubenswrapper[4749]: I0219 18:50:39.993383 4749 scope.go:117] "RemoveContainer" containerID="ea8c40de4fe3e4c9ab0fc186943e8fb38b46bdcd4faac6edefdf7360d53cfea3" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.003652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sbvrv" event={"ID":"59199734-1adc-46b0-9208-75331e4b868c","Type":"ContainerStarted","Data":"aa2f0f92a78c2bb87a63a8dceb7b059823aa477200eb0a02ac4afd44ea7ee150"} Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.003736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sbvrv" event={"ID":"59199734-1adc-46b0-9208-75331e4b868c","Type":"ContainerStarted","Data":"102d838a171db586fb65f8b49b3e1c12d1e81d06e70bfe5179eaf754e4d30654"} Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.084070 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-sbvrv" podStartSLOduration=3.084026172 podStartE2EDuration="3.084026172s" podCreationTimestamp="2026-02-19 18:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:50:40.045387211 +0000 UTC m=+1014.006607165" watchObservedRunningTime="2026-02-19 18:50:40.084026172 +0000 UTC m=+1014.045246126" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.122469 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc7ffd55-bhsd5"] Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.132598 4749 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc7ffd55-bhsd5"] Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.242236 4749 scope.go:117] "RemoveContainer" containerID="5b1067ca11904662e8871d06d2d7eb4f225d389757b52fb232ac361677594b88" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.467780 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.637525 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wd6g\" (UniqueName: \"kubernetes.io/projected/a952b799-be38-4df9-82cc-b6b09536f733-kube-api-access-9wd6g\") pod \"a952b799-be38-4df9-82cc-b6b09536f733\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.637819 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-dns-svc\") pod \"a952b799-be38-4df9-82cc-b6b09536f733\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.637887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-config\") pod \"a952b799-be38-4df9-82cc-b6b09536f733\" (UID: \"a952b799-be38-4df9-82cc-b6b09536f733\") " Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.641555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a952b799-be38-4df9-82cc-b6b09536f733-kube-api-access-9wd6g" (OuterVolumeSpecName: "kube-api-access-9wd6g") pod "a952b799-be38-4df9-82cc-b6b09536f733" (UID: "a952b799-be38-4df9-82cc-b6b09536f733"). InnerVolumeSpecName "kube-api-access-9wd6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.656107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-config" (OuterVolumeSpecName: "config") pod "a952b799-be38-4df9-82cc-b6b09536f733" (UID: "a952b799-be38-4df9-82cc-b6b09536f733"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.660752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a952b799-be38-4df9-82cc-b6b09536f733" (UID: "a952b799-be38-4df9-82cc-b6b09536f733"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.693110 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" path="/var/lib/kubelet/pods/ed4e598f-abad-4bd7-997c-c4f54a2e2321/volumes" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.739566 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wd6g\" (UniqueName: \"kubernetes.io/projected/a952b799-be38-4df9-82cc-b6b09536f733-kube-api-access-9wd6g\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.739605 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:40 crc kubenswrapper[4749]: I0219 18:50:40.739618 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a952b799-be38-4df9-82cc-b6b09536f733-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.014782 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" event={"ID":"1791b0e1-f604-483e-b4f3-79efb0779828","Type":"ContainerStarted","Data":"ff2cd942b500f9ff402b86e4661b7086436627224e6cbea39c903f00a21622a1"} Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.014966 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.018255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2231660b-7776-4cf8-a793-7d592dd23ecf","Type":"ContainerStarted","Data":"491485ff857b462958219404aa90ec9e4d71bab35d668577410de9a0ece0393a"} Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.018287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2231660b-7776-4cf8-a793-7d592dd23ecf","Type":"ContainerStarted","Data":"881cf9e3b65bb789245ccbcd89d38f8b4996af0310c2280d81d07d4ab42ecc73"} Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.019225 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.039606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" event={"ID":"a952b799-be38-4df9-82cc-b6b09536f733","Type":"ContainerDied","Data":"0d539328743270de1cc13f3b0dc4de9af0e6a8336cfa8a093bf2af0472882e7c"} Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.039634 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4bcfd54c-vsmph" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.039670 4749 scope.go:117] "RemoveContainer" containerID="c48671f7a6e7df2e763554de45cf319141b99a66ca0716f6ad587d17e523bab5" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.042648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" event={"ID":"507b0ab0-b98c-490e-95d4-c9b7e88d969d","Type":"ContainerStarted","Data":"e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1"} Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.061121 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" podStartSLOduration=4.061098075 podStartE2EDuration="4.061098075s" podCreationTimestamp="2026-02-19 18:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:50:41.051786509 +0000 UTC m=+1015.013006473" watchObservedRunningTime="2026-02-19 18:50:41.061098075 +0000 UTC m=+1015.022318049" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.077679 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" podStartSLOduration=4.077655098 podStartE2EDuration="4.077655098s" podCreationTimestamp="2026-02-19 18:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:50:41.071626932 +0000 UTC m=+1015.032846886" watchObservedRunningTime="2026-02-19 18:50:41.077655098 +0000 UTC m=+1015.038875052" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.121984 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.428517164 podStartE2EDuration="4.121964077s" podCreationTimestamp="2026-02-19 18:50:37 +0000 UTC" 
firstStartedPulling="2026-02-19 18:50:39.56532213 +0000 UTC m=+1013.526542084" lastFinishedPulling="2026-02-19 18:50:40.258769043 +0000 UTC m=+1014.219988997" observedRunningTime="2026-02-19 18:50:41.112440065 +0000 UTC m=+1015.073660009" watchObservedRunningTime="2026-02-19 18:50:41.121964077 +0000 UTC m=+1015.083184041" Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.164331 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4bcfd54c-vsmph"] Feb 19 18:50:41 crc kubenswrapper[4749]: I0219 18:50:41.171786 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d4bcfd54c-vsmph"] Feb 19 18:50:42 crc kubenswrapper[4749]: I0219 18:50:42.057194 4749 generic.go:334] "Generic (PLEG): container finished" podID="e84776ec-57db-4685-84f6-f86655d9f079" containerID="7d23666248d096d152cba583714b2eb367184b5f2b321bfc383c8f719e2fd902" exitCode=0 Feb 19 18:50:42 crc kubenswrapper[4749]: I0219 18:50:42.057263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e84776ec-57db-4685-84f6-f86655d9f079","Type":"ContainerDied","Data":"7d23666248d096d152cba583714b2eb367184b5f2b321bfc383c8f719e2fd902"} Feb 19 18:50:42 crc kubenswrapper[4749]: I0219 18:50:42.062755 4749 generic.go:334] "Generic (PLEG): container finished" podID="affb1316-cbf5-4641-bdd7-186e390b9e7e" containerID="95117842f82f4291afd5168ceb126a41b70074eab7b061ad5d75b1798561eef2" exitCode=0 Feb 19 18:50:42 crc kubenswrapper[4749]: I0219 18:50:42.064124 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"affb1316-cbf5-4641-bdd7-186e390b9e7e","Type":"ContainerDied","Data":"95117842f82f4291afd5168ceb126a41b70074eab7b061ad5d75b1798561eef2"} Feb 19 18:50:42 crc kubenswrapper[4749]: I0219 18:50:42.064326 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:42 crc kubenswrapper[4749]: I0219 
18:50:42.690623 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a952b799-be38-4df9-82cc-b6b09536f733" path="/var/lib/kubelet/pods/a952b799-be38-4df9-82cc-b6b09536f733/volumes" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.072420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e84776ec-57db-4685-84f6-f86655d9f079","Type":"ContainerStarted","Data":"cb6d8f9db1beb8b990904a51d119e4a5b077c7a5ec6a962d8d52ceaaa90a407b"} Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.074197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"affb1316-cbf5-4641-bdd7-186e390b9e7e","Type":"ContainerStarted","Data":"391a7e09a830fd3c1b008af8115b4713335cb4b700daf53578faf20d3bcd0bb0"} Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.076481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerStarted","Data":"d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c"} Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.102384 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=38.284739798 podStartE2EDuration="44.102357023s" podCreationTimestamp="2026-02-19 18:49:59 +0000 UTC" firstStartedPulling="2026-02-19 18:50:23.673050048 +0000 UTC m=+997.634270002" lastFinishedPulling="2026-02-19 18:50:29.490667263 +0000 UTC m=+1003.451887227" observedRunningTime="2026-02-19 18:50:43.09149762 +0000 UTC m=+1017.052717584" watchObservedRunningTime="2026-02-19 18:50:43.102357023 +0000 UTC m=+1017.063576997" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.123582 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=39.314689047 podStartE2EDuration="45.123561089s" 
podCreationTimestamp="2026-02-19 18:49:58 +0000 UTC" firstStartedPulling="2026-02-19 18:50:23.681215376 +0000 UTC m=+997.642435330" lastFinishedPulling="2026-02-19 18:50:29.490087418 +0000 UTC m=+1003.451307372" observedRunningTime="2026-02-19 18:50:43.1153509 +0000 UTC m=+1017.076570884" watchObservedRunningTime="2026-02-19 18:50:43.123561089 +0000 UTC m=+1017.084781053" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.281198 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.485626 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559c64c447-q9fg6"] Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.536405 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-746f4bbcc9-sjckp"] Feb 19 18:50:43 crc kubenswrapper[4749]: E0219 18:50:43.536742 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerName="dnsmasq-dns" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.536760 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerName="dnsmasq-dns" Feb 19 18:50:43 crc kubenswrapper[4749]: E0219 18:50:43.536782 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerName="init" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.536789 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerName="init" Feb 19 18:50:43 crc kubenswrapper[4749]: E0219 18:50:43.536812 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a952b799-be38-4df9-82cc-b6b09536f733" containerName="init" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.536818 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a952b799-be38-4df9-82cc-b6b09536f733" 
containerName="init" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.536981 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a952b799-be38-4df9-82cc-b6b09536f733" containerName="init" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.537007 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4e598f-abad-4bd7-997c-c4f54a2e2321" containerName="dnsmasq-dns" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.537931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.564121 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746f4bbcc9-sjckp"] Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.621776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-dns-svc\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.621819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-config\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.621844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-nb\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: 
I0219 18:50:43.621866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-sb\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.621956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gp2p\" (UniqueName: \"kubernetes.io/projected/feeb65e2-e83b-4028-b7de-fa94205ccd40-kube-api-access-4gp2p\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.723012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gp2p\" (UniqueName: \"kubernetes.io/projected/feeb65e2-e83b-4028-b7de-fa94205ccd40-kube-api-access-4gp2p\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.723093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-dns-svc\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.723120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-config\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 
18:50:43.723149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-nb\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.723164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-sb\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.723964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-dns-svc\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.724202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-sb\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.724778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-config\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.724926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-nb\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.755850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gp2p\" (UniqueName: \"kubernetes.io/projected/feeb65e2-e83b-4028-b7de-fa94205ccd40-kube-api-access-4gp2p\") pod \"dnsmasq-dns-746f4bbcc9-sjckp\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:43 crc kubenswrapper[4749]: I0219 18:50:43.858690 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.086098 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" podUID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerName="dnsmasq-dns" containerID="cri-o://e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1" gracePeriod=10 Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.325881 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746f4bbcc9-sjckp"] Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.511983 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.538420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-config\") pod \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.538637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-nb\") pod \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.538726 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5whk2\" (UniqueName: \"kubernetes.io/projected/507b0ab0-b98c-490e-95d4-c9b7e88d969d-kube-api-access-5whk2\") pod \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.538752 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-sb\") pod \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.538803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-dns-svc\") pod \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\" (UID: \"507b0ab0-b98c-490e-95d4-c9b7e88d969d\") " Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.547446 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/507b0ab0-b98c-490e-95d4-c9b7e88d969d-kube-api-access-5whk2" (OuterVolumeSpecName: "kube-api-access-5whk2") pod "507b0ab0-b98c-490e-95d4-c9b7e88d969d" (UID: "507b0ab0-b98c-490e-95d4-c9b7e88d969d"). InnerVolumeSpecName "kube-api-access-5whk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.588495 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-config" (OuterVolumeSpecName: "config") pod "507b0ab0-b98c-490e-95d4-c9b7e88d969d" (UID: "507b0ab0-b98c-490e-95d4-c9b7e88d969d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.590949 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "507b0ab0-b98c-490e-95d4-c9b7e88d969d" (UID: "507b0ab0-b98c-490e-95d4-c9b7e88d969d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.591895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "507b0ab0-b98c-490e-95d4-c9b7e88d969d" (UID: "507b0ab0-b98c-490e-95d4-c9b7e88d969d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.599413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "507b0ab0-b98c-490e-95d4-c9b7e88d969d" (UID: "507b0ab0-b98c-490e-95d4-c9b7e88d969d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.641369 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.641404 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5whk2\" (UniqueName: \"kubernetes.io/projected/507b0ab0-b98c-490e-95d4-c9b7e88d969d-kube-api-access-5whk2\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.641416 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.641425 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.641434 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507b0ab0-b98c-490e-95d4-c9b7e88d969d-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.719767 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 19 18:50:44 crc kubenswrapper[4749]: E0219 18:50:44.720169 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerName="dnsmasq-dns" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.720184 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerName="dnsmasq-dns" Feb 19 18:50:44 crc kubenswrapper[4749]: E0219 18:50:44.720197 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerName="init" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.720203 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerName="init" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.720446 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerName="dnsmasq-dns" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.746157 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.746369 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.752325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.752605 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.752831 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.752340 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gbskg" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.851958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece11938-c758-4d62-ad84-c630d040f511-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.852299 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.852340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ece11938-c758-4d62-ad84-c630d040f511-lock\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.852387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ece11938-c758-4d62-ad84-c630d040f511-cache\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.852421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.852451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlfzf\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-kube-api-access-nlfzf\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.955360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ece11938-c758-4d62-ad84-c630d040f511-cache\") 
pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.955467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.955539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlfzf\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-kube-api-access-nlfzf\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.955606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece11938-c758-4d62-ad84-c630d040f511-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.955637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.955925 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.955979 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ece11938-c758-4d62-ad84-c630d040f511-cache\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: E0219 18:50:44.956958 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 18:50:44 crc kubenswrapper[4749]: E0219 18:50:44.956980 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 18:50:44 crc kubenswrapper[4749]: E0219 18:50:44.957067 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift podName:ece11938-c758-4d62-ad84-c630d040f511 nodeName:}" failed. No retries permitted until 2026-02-19 18:50:45.457010881 +0000 UTC m=+1019.418230835 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift") pod "swift-storage-0" (UID: "ece11938-c758-4d62-ad84-c630d040f511") : configmap "swift-ring-files" not found Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.957202 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ece11938-c758-4d62-ad84-c630d040f511-lock\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.957646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ece11938-c758-4d62-ad84-c630d040f511-lock\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.971413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece11938-c758-4d62-ad84-c630d040f511-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.974888 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlfzf\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-kube-api-access-nlfzf\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:44 crc kubenswrapper[4749]: I0219 18:50:44.995837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 
18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.096604 4749 generic.go:334] "Generic (PLEG): container finished" podID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" containerID="e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1" exitCode=0 Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.096667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" event={"ID":"507b0ab0-b98c-490e-95d4-c9b7e88d969d","Type":"ContainerDied","Data":"e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1"} Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.096696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" event={"ID":"507b0ab0-b98c-490e-95d4-c9b7e88d969d","Type":"ContainerDied","Data":"7c7b695bba76fa6c318d54c4cd95a7239e967590eb1a4ecd71d81904cfecd71b"} Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.096701 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559c64c447-q9fg6" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.096712 4749 scope.go:117] "RemoveContainer" containerID="e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.098287 4749 generic.go:334] "Generic (PLEG): container finished" podID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerID="cd262b4a040b37c0de314cd9b9ee003eee8028544d645592dbdc70c9d5d62512" exitCode=0 Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.098319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" event={"ID":"feeb65e2-e83b-4028-b7de-fa94205ccd40","Type":"ContainerDied","Data":"cd262b4a040b37c0de314cd9b9ee003eee8028544d645592dbdc70c9d5d62512"} Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.098336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" 
event={"ID":"feeb65e2-e83b-4028-b7de-fa94205ccd40","Type":"ContainerStarted","Data":"76ff485dcce11c75362c064274cdb60ebe4ed6ff9773cdf30c4c7334c86f0081"} Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.141513 4749 scope.go:117] "RemoveContainer" containerID="d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.147933 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559c64c447-q9fg6"] Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.159451 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-559c64c447-q9fg6"] Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.214114 4749 scope.go:117] "RemoveContainer" containerID="e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1" Feb 19 18:50:45 crc kubenswrapper[4749]: E0219 18:50:45.214560 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1\": container with ID starting with e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1 not found: ID does not exist" containerID="e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.214590 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1"} err="failed to get container status \"e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1\": rpc error: code = NotFound desc = could not find container \"e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1\": container with ID starting with e502235c7d34cb1f2a8de23d13ceaa0f8459280bf7adb82e9a4f68e6f4f6cfc1 not found: ID does not exist" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.214614 4749 scope.go:117] "RemoveContainer" 
containerID="d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec" Feb 19 18:50:45 crc kubenswrapper[4749]: E0219 18:50:45.217555 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec\": container with ID starting with d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec not found: ID does not exist" containerID="d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.217592 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec"} err="failed to get container status \"d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec\": rpc error: code = NotFound desc = could not find container \"d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec\": container with ID starting with d02276dbe0776f7bbcc9b6f9910d28c39272126b3c3a3ed275e977d7660d65ec not found: ID does not exist" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.325432 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gf9kk"] Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.327277 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.332656 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.332931 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.339525 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.360487 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gf9kk"] Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.366332 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p2pn\" (UniqueName: \"kubernetes.io/projected/63ce035e-5489-4e24-8b72-55137a407adc-kube-api-access-7p2pn\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.366632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-ring-data-devices\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.366811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-dispersionconf\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc 
kubenswrapper[4749]: I0219 18:50:45.366950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-combined-ca-bundle\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.367141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-scripts\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.367282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/63ce035e-5489-4e24-8b72-55137a407adc-etc-swift\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.389164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-swiftconf\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.389278 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-czb4z"] Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.390605 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.395888 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gf9kk"] Feb 19 18:50:45 crc kubenswrapper[4749]: E0219 18:50:45.397297 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-7p2pn ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-gf9kk" podUID="63ce035e-5489-4e24-8b72-55137a407adc" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.402261 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-czb4z"] Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p2pn\" (UniqueName: \"kubernetes.io/projected/63ce035e-5489-4e24-8b72-55137a407adc-kube-api-access-7p2pn\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-swiftconf\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490631 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-ring-data-devices\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc 
kubenswrapper[4749]: I0219 18:50:45.490649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qqd\" (UniqueName: \"kubernetes.io/projected/f1234ce5-5e40-4f76-a3b5-8b47853bf147-kube-api-access-67qqd\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-dispersionconf\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-combined-ca-bundle\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-dispersionconf\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490785 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-combined-ca-bundle\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-scripts\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.490839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/63ce035e-5489-4e24-8b72-55137a407adc-etc-swift\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.491000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-swiftconf\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.491092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1234ce5-5e40-4f76-a3b5-8b47853bf147-etc-swift\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.491115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-scripts\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.491145 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-ring-data-devices\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.491344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/63ce035e-5489-4e24-8b72-55137a407adc-etc-swift\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:45 crc kubenswrapper[4749]: E0219 18:50:45.491586 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 18:50:45 crc kubenswrapper[4749]: E0219 18:50:45.491605 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 18:50:45 crc kubenswrapper[4749]: E0219 18:50:45.491644 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift podName:ece11938-c758-4d62-ad84-c630d040f511 nodeName:}" failed. No retries permitted until 2026-02-19 18:50:46.49162944 +0000 UTC m=+1020.452849394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift") pod "swift-storage-0" (UID: "ece11938-c758-4d62-ad84-c630d040f511") : configmap "swift-ring-files" not found
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.492588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-ring-data-devices\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.492991 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-scripts\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.500984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-combined-ca-bundle\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.504510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-swiftconf\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.504704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-dispersionconf\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.514378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p2pn\" (UniqueName: \"kubernetes.io/projected/63ce035e-5489-4e24-8b72-55137a407adc-kube-api-access-7p2pn\") pod \"swift-ring-rebalance-gf9kk\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") " pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1234ce5-5e40-4f76-a3b5-8b47853bf147-etc-swift\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593061 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-scripts\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-ring-data-devices\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-swiftconf\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qqd\" (UniqueName: \"kubernetes.io/projected/f1234ce5-5e40-4f76-a3b5-8b47853bf147-kube-api-access-67qqd\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-dispersionconf\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-combined-ca-bundle\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593914 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-scripts\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.593956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-ring-data-devices\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.594549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1234ce5-5e40-4f76-a3b5-8b47853bf147-etc-swift\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.596724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-dispersionconf\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.597519 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-combined-ca-bundle\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.598504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-swiftconf\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.608333 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qqd\" (UniqueName: \"kubernetes.io/projected/f1234ce5-5e40-4f76-a3b5-8b47853bf147-kube-api-access-67qqd\") pod \"swift-ring-rebalance-czb4z\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:45 crc kubenswrapper[4749]: I0219 18:50:45.714565 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-czb4z"
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.113771 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.115284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" event={"ID":"feeb65e2-e83b-4028-b7de-fa94205ccd40","Type":"ContainerStarted","Data":"8ba3d5b99d0381c433531c1399e7c1130409eac94c1973be2efe946a9286bcb0"}
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.115315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp"
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.122366 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.133408 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" podStartSLOduration=3.133388835 podStartE2EDuration="3.133388835s" podCreationTimestamp="2026-02-19 18:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:50:46.131892889 +0000 UTC m=+1020.093112843" watchObservedRunningTime="2026-02-19 18:50:46.133388835 +0000 UTC m=+1020.094608789"
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.183678 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-czb4z"]
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.205407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-scripts\") pod \"63ce035e-5489-4e24-8b72-55137a407adc\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") "
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.205436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-ring-data-devices\") pod \"63ce035e-5489-4e24-8b72-55137a407adc\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") "
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.205485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/63ce035e-5489-4e24-8b72-55137a407adc-etc-swift\") pod \"63ce035e-5489-4e24-8b72-55137a407adc\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") "
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.205530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-dispersionconf\") pod \"63ce035e-5489-4e24-8b72-55137a407adc\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") "
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.205556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p2pn\" (UniqueName: \"kubernetes.io/projected/63ce035e-5489-4e24-8b72-55137a407adc-kube-api-access-7p2pn\") pod \"63ce035e-5489-4e24-8b72-55137a407adc\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") "
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.205574 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-combined-ca-bundle\") pod \"63ce035e-5489-4e24-8b72-55137a407adc\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") "
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.205605 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-swiftconf\") pod \"63ce035e-5489-4e24-8b72-55137a407adc\" (UID: \"63ce035e-5489-4e24-8b72-55137a407adc\") "
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.206010 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63ce035e-5489-4e24-8b72-55137a407adc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "63ce035e-5489-4e24-8b72-55137a407adc" (UID: "63ce035e-5489-4e24-8b72-55137a407adc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.206386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-scripts" (OuterVolumeSpecName: "scripts") pod "63ce035e-5489-4e24-8b72-55137a407adc" (UID: "63ce035e-5489-4e24-8b72-55137a407adc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.206453 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "63ce035e-5489-4e24-8b72-55137a407adc" (UID: "63ce035e-5489-4e24-8b72-55137a407adc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.211609 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "63ce035e-5489-4e24-8b72-55137a407adc" (UID: "63ce035e-5489-4e24-8b72-55137a407adc"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.211702 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63ce035e-5489-4e24-8b72-55137a407adc" (UID: "63ce035e-5489-4e24-8b72-55137a407adc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.212133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "63ce035e-5489-4e24-8b72-55137a407adc" (UID: "63ce035e-5489-4e24-8b72-55137a407adc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.214318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ce035e-5489-4e24-8b72-55137a407adc-kube-api-access-7p2pn" (OuterVolumeSpecName: "kube-api-access-7p2pn") pod "63ce035e-5489-4e24-8b72-55137a407adc" (UID: "63ce035e-5489-4e24-8b72-55137a407adc"). InnerVolumeSpecName "kube-api-access-7p2pn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.306867 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/63ce035e-5489-4e24-8b72-55137a407adc-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.306899 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.306909 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p2pn\" (UniqueName: \"kubernetes.io/projected/63ce035e-5489-4e24-8b72-55137a407adc-kube-api-access-7p2pn\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.306917 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.306926 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/63ce035e-5489-4e24-8b72-55137a407adc-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.306934 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.306942 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/63ce035e-5489-4e24-8b72-55137a407adc-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.513837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0"
Feb 19 18:50:46 crc kubenswrapper[4749]: E0219 18:50:46.514057 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 18:50:46 crc kubenswrapper[4749]: E0219 18:50:46.514085 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 18:50:46 crc kubenswrapper[4749]: E0219 18:50:46.514149 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift podName:ece11938-c758-4d62-ad84-c630d040f511 nodeName:}" failed. No retries permitted until 2026-02-19 18:50:48.514126779 +0000 UTC m=+1022.475346733 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift") pod "swift-storage-0" (UID: "ece11938-c758-4d62-ad84-c630d040f511") : configmap "swift-ring-files" not found
Feb 19 18:50:46 crc kubenswrapper[4749]: I0219 18:50:46.718251 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507b0ab0-b98c-490e-95d4-c9b7e88d969d" path="/var/lib/kubelet/pods/507b0ab0-b98c-490e-95d4-c9b7e88d969d/volumes"
Feb 19 18:50:47 crc kubenswrapper[4749]: I0219 18:50:47.120644 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gf9kk"
Feb 19 18:50:47 crc kubenswrapper[4749]: I0219 18:50:47.158526 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gf9kk"]
Feb 19 18:50:47 crc kubenswrapper[4749]: I0219 18:50:47.165272 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-gf9kk"]
Feb 19 18:50:47 crc kubenswrapper[4749]: I0219 18:50:47.613795 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59"
Feb 19 18:50:48 crc kubenswrapper[4749]: I0219 18:50:48.546923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0"
Feb 19 18:50:48 crc kubenswrapper[4749]: E0219 18:50:48.547797 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 18:50:48 crc kubenswrapper[4749]: E0219 18:50:48.547894 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 18:50:48 crc kubenswrapper[4749]: E0219 18:50:48.548045 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift podName:ece11938-c758-4d62-ad84-c630d040f511 nodeName:}" failed. No retries permitted until 2026-02-19 18:50:52.548011118 +0000 UTC m=+1026.509231072 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift") pod "swift-storage-0" (UID: "ece11938-c758-4d62-ad84-c630d040f511") : configmap "swift-ring-files" not found
Feb 19 18:50:48 crc kubenswrapper[4749]: I0219 18:50:48.688986 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ce035e-5489-4e24-8b72-55137a407adc" path="/var/lib/kubelet/pods/63ce035e-5489-4e24-8b72-55137a407adc/volumes"
Feb 19 18:50:49 crc kubenswrapper[4749]: I0219 18:50:49.413120 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 19 18:50:49 crc kubenswrapper[4749]: I0219 18:50:49.413173 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 19 18:50:50 crc kubenswrapper[4749]: I0219 18:50:50.714756 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 19 18:50:50 crc kubenswrapper[4749]: I0219 18:50:50.715111 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 19 18:50:50 crc kubenswrapper[4749]: I0219 18:50:50.737626 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 19 18:50:50 crc kubenswrapper[4749]: I0219 18:50:50.845786 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 19 18:50:52 crc kubenswrapper[4749]: I0219 18:50:52.619299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0"
Feb 19 18:50:52 crc kubenswrapper[4749]: E0219 18:50:52.619483 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 18:50:52 crc kubenswrapper[4749]: E0219 18:50:52.619778 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 18:50:52 crc kubenswrapper[4749]: E0219 18:50:52.619828 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift podName:ece11938-c758-4d62-ad84-c630d040f511 nodeName:}" failed. No retries permitted until 2026-02-19 18:51:00.619809493 +0000 UTC m=+1034.581029447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift") pod "swift-storage-0" (UID: "ece11938-c758-4d62-ad84-c630d040f511") : configmap "swift-ring-files" not found
Feb 19 18:50:53 crc kubenswrapper[4749]: I0219 18:50:53.861146 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp"
Feb 19 18:50:53 crc kubenswrapper[4749]: I0219 18:50:53.913623 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7f86b5bc-tks59"]
Feb 19 18:50:53 crc kubenswrapper[4749]: I0219 18:50:53.913869 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" podUID="1791b0e1-f604-483e-b4f3-79efb0779828" containerName="dnsmasq-dns" containerID="cri-o://ff2cd942b500f9ff402b86e4661b7086436627224e6cbea39c903f00a21622a1" gracePeriod=10
Feb 19 18:50:54 crc kubenswrapper[4749]: W0219 18:50:54.032184 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1234ce5_5e40_4f76_a3b5_8b47853bf147.slice/crio-c2b7c510e5516736fca67d3867c88a114e489f997d7b70e78518e9c519fbe3b1 WatchSource:0}: Error finding container c2b7c510e5516736fca67d3867c88a114e489f997d7b70e78518e9c519fbe3b1: Status 404 returned error can't find the container with id c2b7c510e5516736fca67d3867c88a114e489f997d7b70e78518e9c519fbe3b1
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.176706 4749 generic.go:334] "Generic (PLEG): container finished" podID="1791b0e1-f604-483e-b4f3-79efb0779828" containerID="ff2cd942b500f9ff402b86e4661b7086436627224e6cbea39c903f00a21622a1" exitCode=0
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.176786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" event={"ID":"1791b0e1-f604-483e-b4f3-79efb0779828","Type":"ContainerDied","Data":"ff2cd942b500f9ff402b86e4661b7086436627224e6cbea39c903f00a21622a1"}
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.178227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czb4z" event={"ID":"f1234ce5-5e40-4f76-a3b5-8b47853bf147","Type":"ContainerStarted","Data":"c2b7c510e5516736fca67d3867c88a114e489f997d7b70e78518e9c519fbe3b1"}
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.556166 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59"
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.725260 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.725315 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.725362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.726639 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53301e57110cdb8ea70a0d60fa7f17a0a5c063180c8f2db9d35e0ad01b3622e9"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.726698 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://53301e57110cdb8ea70a0d60fa7f17a0a5c063180c8f2db9d35e0ad01b3622e9" gracePeriod=600
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.753645 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-dns-svc\") pod \"1791b0e1-f604-483e-b4f3-79efb0779828\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") "
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.753935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-ovsdbserver-sb\") pod \"1791b0e1-f604-483e-b4f3-79efb0779828\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") "
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.753986 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-config\") pod \"1791b0e1-f604-483e-b4f3-79efb0779828\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") "
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.754056 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46w9l\" (UniqueName: \"kubernetes.io/projected/1791b0e1-f604-483e-b4f3-79efb0779828-kube-api-access-46w9l\") pod \"1791b0e1-f604-483e-b4f3-79efb0779828\" (UID: \"1791b0e1-f604-483e-b4f3-79efb0779828\") "
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.768317 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1791b0e1-f604-483e-b4f3-79efb0779828-kube-api-access-46w9l" (OuterVolumeSpecName: "kube-api-access-46w9l") pod "1791b0e1-f604-483e-b4f3-79efb0779828" (UID: "1791b0e1-f604-483e-b4f3-79efb0779828"). InnerVolumeSpecName "kube-api-access-46w9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.794978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1791b0e1-f604-483e-b4f3-79efb0779828" (UID: "1791b0e1-f604-483e-b4f3-79efb0779828"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.795433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-config" (OuterVolumeSpecName: "config") pod "1791b0e1-f604-483e-b4f3-79efb0779828" (UID: "1791b0e1-f604-483e-b4f3-79efb0779828"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.796115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1791b0e1-f604-483e-b4f3-79efb0779828" (UID: "1791b0e1-f604-483e-b4f3-79efb0779828"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.855686 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46w9l\" (UniqueName: \"kubernetes.io/projected/1791b0e1-f604-483e-b4f3-79efb0779828-kube-api-access-46w9l\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.855714 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.855725 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:54 crc kubenswrapper[4749]: I0219 18:50:54.855732 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1791b0e1-f604-483e-b4f3-79efb0779828-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.189695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerStarted","Data":"805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4"}
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.194092 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="53301e57110cdb8ea70a0d60fa7f17a0a5c063180c8f2db9d35e0ad01b3622e9" exitCode=0
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.194179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"53301e57110cdb8ea70a0d60fa7f17a0a5c063180c8f2db9d35e0ad01b3622e9"}
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.194211 4749 scope.go:117] "RemoveContainer" containerID="d32840142f59a3f51a6617459783496dfcb99167a3d91ff021347454591db672"
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.197054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59" event={"ID":"1791b0e1-f604-483e-b4f3-79efb0779828","Type":"ContainerDied","Data":"7ca5d848a8edcf399f69e4496df88f257a9acf317ee172a5971446acc738d3f9"}
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.197134 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7f86b5bc-tks59"
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.219450 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.450408377 podStartE2EDuration="52.219428257s" podCreationTimestamp="2026-02-19 18:50:03 +0000 UTC" firstStartedPulling="2026-02-19 18:50:21.5315614 +0000 UTC m=+995.492781344" lastFinishedPulling="2026-02-19 18:50:54.30058127 +0000 UTC m=+1028.261801224" observedRunningTime="2026-02-19 18:50:55.211638998 +0000 UTC m=+1029.172858962" watchObservedRunningTime="2026-02-19 18:50:55.219428257 +0000 UTC m=+1029.180648231"
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.236049 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7f86b5bc-tks59"]
Feb 19 18:50:55 crc kubenswrapper[4749]: I0219 18:50:55.242660 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c7f86b5bc-tks59"]
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.101649 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f544-account-create-update-h22hj"]
Feb 19 18:50:56 crc kubenswrapper[4749]: E0219 18:50:56.103505 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1791b0e1-f604-483e-b4f3-79efb0779828" containerName="init"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.103539 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1791b0e1-f604-483e-b4f3-79efb0779828" containerName="init"
Feb 19 18:50:56 crc kubenswrapper[4749]: E0219 18:50:56.103604 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1791b0e1-f604-483e-b4f3-79efb0779828" containerName="dnsmasq-dns"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.103615 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1791b0e1-f604-483e-b4f3-79efb0779828" containerName="dnsmasq-dns"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.104774 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1791b0e1-f604-483e-b4f3-79efb0779828" containerName="dnsmasq-dns"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.105453 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f544-account-create-update-h22hj"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.107912 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.130716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f544-account-create-update-h22hj"]
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.141957 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8mb6x"]
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.145605 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8mb6x"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.157594 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8mb6x"]
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.207938 4749 generic.go:334] "Generic (PLEG): container finished" podID="27dfe8e9-686d-4703-b36d-df6b94491b40" containerID="c0e69d20b50ef864ddb00ad8f03e6c1432650658590c1887e3429fa7d1f8449f" exitCode=0
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.208092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"27dfe8e9-686d-4703-b36d-df6b94491b40","Type":"ContainerDied","Data":"c0e69d20b50ef864ddb00ad8f03e6c1432650658590c1887e3429fa7d1f8449f"}
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.284075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-operator-scripts\") pod \"glance-f544-account-create-update-h22hj\" (UID: \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " pod="openstack/glance-f544-account-create-update-h22hj"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.284140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a056c11-d814-41ff-a7b7-a1aaa36ed053-operator-scripts\") pod \"glance-db-create-8mb6x\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " pod="openstack/glance-db-create-8mb6x"
Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.284248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29dn8\" (UniqueName: \"kubernetes.io/projected/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-kube-api-access-29dn8\") pod \"glance-f544-account-create-update-h22hj\" (UID:
\"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.284289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p574n\" (UniqueName: \"kubernetes.io/projected/0a056c11-d814-41ff-a7b7-a1aaa36ed053-kube-api-access-p574n\") pod \"glance-db-create-8mb6x\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " pod="openstack/glance-db-create-8mb6x" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.386169 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29dn8\" (UniqueName: \"kubernetes.io/projected/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-kube-api-access-29dn8\") pod \"glance-f544-account-create-update-h22hj\" (UID: \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.386445 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p574n\" (UniqueName: \"kubernetes.io/projected/0a056c11-d814-41ff-a7b7-a1aaa36ed053-kube-api-access-p574n\") pod \"glance-db-create-8mb6x\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " pod="openstack/glance-db-create-8mb6x" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.386693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-operator-scripts\") pod \"glance-f544-account-create-update-h22hj\" (UID: \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.386771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0a056c11-d814-41ff-a7b7-a1aaa36ed053-operator-scripts\") pod \"glance-db-create-8mb6x\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " pod="openstack/glance-db-create-8mb6x" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.387889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a056c11-d814-41ff-a7b7-a1aaa36ed053-operator-scripts\") pod \"glance-db-create-8mb6x\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " pod="openstack/glance-db-create-8mb6x" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.388433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-operator-scripts\") pod \"glance-f544-account-create-update-h22hj\" (UID: \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.410911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29dn8\" (UniqueName: \"kubernetes.io/projected/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-kube-api-access-29dn8\") pod \"glance-f544-account-create-update-h22hj\" (UID: \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.413546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p574n\" (UniqueName: \"kubernetes.io/projected/0a056c11-d814-41ff-a7b7-a1aaa36ed053-kube-api-access-p574n\") pod \"glance-db-create-8mb6x\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " pod="openstack/glance-db-create-8mb6x" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.485749 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8mb6x" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.488950 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.688398 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1791b0e1-f604-483e-b4f3-79efb0779828" path="/var/lib/kubelet/pods/1791b0e1-f604-483e-b4f3-79efb0779828/volumes" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.849120 4749 scope.go:117] "RemoveContainer" containerID="ff2cd942b500f9ff402b86e4661b7086436627224e6cbea39c903f00a21622a1" Feb 19 18:50:56 crc kubenswrapper[4749]: I0219 18:50:56.984601 4749 scope.go:117] "RemoveContainer" containerID="aae4c96dad7ebe1a2d5a2cd30fbb4333009f03fa387f0a0d855c07094dddbd3f" Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.067073 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.217492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czb4z" event={"ID":"f1234ce5-5e40-4f76-a3b5-8b47853bf147","Type":"ContainerStarted","Data":"7e902fa8be8764cd07b40bd1e4e2bf1cedf7edf7ab3e68df5a989c8e0f2e26c7"} Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.219478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"27dfe8e9-686d-4703-b36d-df6b94491b40","Type":"ContainerStarted","Data":"7ffdbd28264433132e17faa8877085137b620997fecfc3aab4e00954788a4641"} Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.220229 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0" Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.220304 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-cell1-galera-0" Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.223907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"92113cb5e4b06748bc8c8134a5b7e7475e1c6d6f0cf9e918b487ab08df45bb1f"} Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.226567 4749 generic.go:334] "Generic (PLEG): container finished" podID="042fb593-4898-4085-889e-7ccb375cf969" containerID="fd26a9103b0a88682847d38bd2a0a2ca1f91ec3eea2089769c23970dca2fdfd3" exitCode=0 Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.226630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"042fb593-4898-4085-889e-7ccb375cf969","Type":"ContainerDied","Data":"fd26a9103b0a88682847d38bd2a0a2ca1f91ec3eea2089769c23970dca2fdfd3"} Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.228596 4749 generic.go:334] "Generic (PLEG): container finished" podID="008062c0-9ccf-4fd2-9b54-63196268da38" containerID="d424cd182682a0c438d89dbb09758c27fa545c0f0dd6fcc5b71037d7ded4f1c4" exitCode=0 Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.228656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"008062c0-9ccf-4fd2-9b54-63196268da38","Type":"ContainerDied","Data":"d424cd182682a0c438d89dbb09758c27fa545c0f0dd6fcc5b71037d7ded4f1c4"} Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.291341 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=52.399286156 podStartE2EDuration="1m2.291310981s" podCreationTimestamp="2026-02-19 18:49:55 +0000 UTC" firstStartedPulling="2026-02-19 18:50:12.514001534 +0000 UTC m=+986.475221498" lastFinishedPulling="2026-02-19 18:50:22.406026369 +0000 UTC m=+996.367246323" observedRunningTime="2026-02-19 
18:50:57.269153851 +0000 UTC m=+1031.230373805" watchObservedRunningTime="2026-02-19 18:50:57.291310981 +0000 UTC m=+1031.252530935" Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.364747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f544-account-create-update-h22hj"] Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.380939 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8mb6x"] Feb 19 18:50:57 crc kubenswrapper[4749]: I0219 18:50:57.938621 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.072083 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b56xl"] Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.073995 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.077906 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.085719 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b56xl"] Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.120412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b3fa88-5159-45a2-a11e-13b43da32f20-operator-scripts\") pod \"root-account-create-update-b56xl\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") " pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.120638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v867\" (UniqueName: 
\"kubernetes.io/projected/71b3fa88-5159-45a2-a11e-13b43da32f20-kube-api-access-8v867\") pod \"root-account-create-update-b56xl\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") " pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.222085 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v867\" (UniqueName: \"kubernetes.io/projected/71b3fa88-5159-45a2-a11e-13b43da32f20-kube-api-access-8v867\") pod \"root-account-create-update-b56xl\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") " pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.222160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b3fa88-5159-45a2-a11e-13b43da32f20-operator-scripts\") pod \"root-account-create-update-b56xl\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") " pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.223059 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b3fa88-5159-45a2-a11e-13b43da32f20-operator-scripts\") pod \"root-account-create-update-b56xl\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") " pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.244451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"042fb593-4898-4085-889e-7ccb375cf969","Type":"ContainerStarted","Data":"27a5fc8a7e6b23c377b6b816b449c02eb2f42223b70eefa4d494e95e767e7d8e"} Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.245328 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.249773 4749 generic.go:334] "Generic 
(PLEG): container finished" podID="0a056c11-d814-41ff-a7b7-a1aaa36ed053" containerID="6df48fc95856a290bf4052b93a35b03d065fad77c8ea9e1645e2ee1c181fcbe5" exitCode=0 Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.249849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8mb6x" event={"ID":"0a056c11-d814-41ff-a7b7-a1aaa36ed053","Type":"ContainerDied","Data":"6df48fc95856a290bf4052b93a35b03d065fad77c8ea9e1645e2ee1c181fcbe5"} Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.249873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8mb6x" event={"ID":"0a056c11-d814-41ff-a7b7-a1aaa36ed053","Type":"ContainerStarted","Data":"e233cbeb80eb1c679255c2123f805ad2c5167302cc0d80536680bdfa1a81add2"} Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.251790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v867\" (UniqueName: \"kubernetes.io/projected/71b3fa88-5159-45a2-a11e-13b43da32f20-kube-api-access-8v867\") pod \"root-account-create-update-b56xl\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") " pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.251888 4749 generic.go:334] "Generic (PLEG): container finished" podID="1abcce5b-61f2-44f5-82e3-9d476a81ccd8" containerID="b711117dbecc09677e41ebcf5ecc036ae53d0f39e0d400503a109aad9d3ff5b6" exitCode=0 Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.251971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f544-account-create-update-h22hj" event={"ID":"1abcce5b-61f2-44f5-82e3-9d476a81ccd8","Type":"ContainerDied","Data":"b711117dbecc09677e41ebcf5ecc036ae53d0f39e0d400503a109aad9d3ff5b6"} Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.251992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f544-account-create-update-h22hj" 
event={"ID":"1abcce5b-61f2-44f5-82e3-9d476a81ccd8","Type":"ContainerStarted","Data":"da2556673088c3be31627513607fd778ca1a8440c6b5eedd83cffc4f418830bf"} Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.254576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"008062c0-9ccf-4fd2-9b54-63196268da38","Type":"ContainerStarted","Data":"a725c718506d3e939f66c1ae53a59f357a04f1a719595919ba7965a418ce89e8"} Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.255001 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.271944 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=62.271924852 podStartE2EDuration="1m2.271924852s" podCreationTimestamp="2026-02-19 18:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:50:58.266456309 +0000 UTC m=+1032.227676283" watchObservedRunningTime="2026-02-19 18:50:58.271924852 +0000 UTC m=+1032.233144806" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.328566 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-czb4z" podStartSLOduration=10.441053189 podStartE2EDuration="13.328552039s" podCreationTimestamp="2026-02-19 18:50:45 +0000 UTC" firstStartedPulling="2026-02-19 18:50:54.041719041 +0000 UTC m=+1028.002938995" lastFinishedPulling="2026-02-19 18:50:56.929217891 +0000 UTC m=+1030.890437845" observedRunningTime="2026-02-19 18:50:58.313075733 +0000 UTC m=+1032.274295707" watchObservedRunningTime="2026-02-19 18:50:58.328552039 +0000 UTC m=+1032.289771993" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.402858 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b56xl" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.829296 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=62.829279813 podStartE2EDuration="1m2.829279813s" podCreationTimestamp="2026-02-19 18:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:50:58.352534893 +0000 UTC m=+1032.313754847" watchObservedRunningTime="2026-02-19 18:50:58.829279813 +0000 UTC m=+1032.790499767" Feb 19 18:50:58 crc kubenswrapper[4749]: I0219 18:50:58.836909 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b56xl"] Feb 19 18:50:58 crc kubenswrapper[4749]: W0219 18:50:58.840642 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b3fa88_5159_45a2_a11e_13b43da32f20.slice/crio-f03ad28a187846c37233a43db50831c0b92300f66445f64abab270d34d5573ee WatchSource:0}: Error finding container f03ad28a187846c37233a43db50831c0b92300f66445f64abab270d34d5573ee: Status 404 returned error can't find the container with id f03ad28a187846c37233a43db50831c0b92300f66445f64abab270d34d5573ee Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.263118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b56xl" event={"ID":"71b3fa88-5159-45a2-a11e-13b43da32f20","Type":"ContainerStarted","Data":"505eaac2bb687c42a34911e46d43ca987fdfa63a1a58808c66af3062702a0237"} Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.263193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b56xl" event={"ID":"71b3fa88-5159-45a2-a11e-13b43da32f20","Type":"ContainerStarted","Data":"f03ad28a187846c37233a43db50831c0b92300f66445f64abab270d34d5573ee"} Feb 19 
18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.289881 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-b56xl" podStartSLOduration=1.28985994 podStartE2EDuration="1.28985994s" podCreationTimestamp="2026-02-19 18:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:50:59.287120544 +0000 UTC m=+1033.248340518" watchObservedRunningTime="2026-02-19 18:50:59.28985994 +0000 UTC m=+1033.251079894" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.626751 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8mb6x" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.648828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.733914 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.744885 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a056c11-d814-41ff-a7b7-a1aaa36ed053-operator-scripts\") pod \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.744985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p574n\" (UniqueName: \"kubernetes.io/projected/0a056c11-d814-41ff-a7b7-a1aaa36ed053-kube-api-access-p574n\") pod \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\" (UID: \"0a056c11-d814-41ff-a7b7-a1aaa36ed053\") " Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.745482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a056c11-d814-41ff-a7b7-a1aaa36ed053-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a056c11-d814-41ff-a7b7-a1aaa36ed053" (UID: "0a056c11-d814-41ff-a7b7-a1aaa36ed053"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.752294 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a056c11-d814-41ff-a7b7-a1aaa36ed053-kube-api-access-p574n" (OuterVolumeSpecName: "kube-api-access-p574n") pod "0a056c11-d814-41ff-a7b7-a1aaa36ed053" (UID: "0a056c11-d814-41ff-a7b7-a1aaa36ed053"). InnerVolumeSpecName "kube-api-access-p574n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.847057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-operator-scripts\") pod \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\" (UID: \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.847156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29dn8\" (UniqueName: \"kubernetes.io/projected/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-kube-api-access-29dn8\") pod \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\" (UID: \"1abcce5b-61f2-44f5-82e3-9d476a81ccd8\") " Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.847596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1abcce5b-61f2-44f5-82e3-9d476a81ccd8" (UID: "1abcce5b-61f2-44f5-82e3-9d476a81ccd8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.847955 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a056c11-d814-41ff-a7b7-a1aaa36ed053-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.847975 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.847985 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p574n\" (UniqueName: \"kubernetes.io/projected/0a056c11-d814-41ff-a7b7-a1aaa36ed053-kube-api-access-p574n\") on node \"crc\" DevicePath \"\"" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.853560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-kube-api-access-29dn8" (OuterVolumeSpecName: "kube-api-access-29dn8") pod "1abcce5b-61f2-44f5-82e3-9d476a81ccd8" (UID: "1abcce5b-61f2-44f5-82e3-9d476a81ccd8"). InnerVolumeSpecName "kube-api-access-29dn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:50:59 crc kubenswrapper[4749]: I0219 18:50:59.949942 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29dn8\" (UniqueName: \"kubernetes.io/projected/1abcce5b-61f2-44f5-82e3-9d476a81ccd8-kube-api-access-29dn8\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.271675 4749 generic.go:334] "Generic (PLEG): container finished" podID="71b3fa88-5159-45a2-a11e-13b43da32f20" containerID="505eaac2bb687c42a34911e46d43ca987fdfa63a1a58808c66af3062702a0237" exitCode=0 Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.271754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b56xl" event={"ID":"71b3fa88-5159-45a2-a11e-13b43da32f20","Type":"ContainerDied","Data":"505eaac2bb687c42a34911e46d43ca987fdfa63a1a58808c66af3062702a0237"} Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.273269 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8mb6x" Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.273504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8mb6x" event={"ID":"0a056c11-d814-41ff-a7b7-a1aaa36ed053","Type":"ContainerDied","Data":"e233cbeb80eb1c679255c2123f805ad2c5167302cc0d80536680bdfa1a81add2"} Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.273536 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e233cbeb80eb1c679255c2123f805ad2c5167302cc0d80536680bdfa1a81add2" Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.275058 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f544-account-create-update-h22hj" Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.275020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f544-account-create-update-h22hj" event={"ID":"1abcce5b-61f2-44f5-82e3-9d476a81ccd8","Type":"ContainerDied","Data":"da2556673088c3be31627513607fd778ca1a8440c6b5eedd83cffc4f418830bf"} Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.275111 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da2556673088c3be31627513607fd778ca1a8440c6b5eedd83cffc4f418830bf" Feb 19 18:51:00 crc kubenswrapper[4749]: I0219 18:51:00.665088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0" Feb 19 18:51:00 crc kubenswrapper[4749]: E0219 18:51:00.665378 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 18:51:00 crc kubenswrapper[4749]: E0219 18:51:00.665403 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 18:51:00 crc kubenswrapper[4749]: E0219 18:51:00.665460 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift podName:ece11938-c758-4d62-ad84-c630d040f511 nodeName:}" failed. No retries permitted until 2026-02-19 18:51:16.665441321 +0000 UTC m=+1050.626661275 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift") pod "swift-storage-0" (UID: "ece11938-c758-4d62-ad84-c630d040f511") : configmap "swift-ring-files" not found
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.338526 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9wjqf"]
Feb 19 18:51:01 crc kubenswrapper[4749]: E0219 18:51:01.339413 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abcce5b-61f2-44f5-82e3-9d476a81ccd8" containerName="mariadb-account-create-update"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.339438 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abcce5b-61f2-44f5-82e3-9d476a81ccd8" containerName="mariadb-account-create-update"
Feb 19 18:51:01 crc kubenswrapper[4749]: E0219 18:51:01.339448 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a056c11-d814-41ff-a7b7-a1aaa36ed053" containerName="mariadb-database-create"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.339454 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a056c11-d814-41ff-a7b7-a1aaa36ed053" containerName="mariadb-database-create"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.339594 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abcce5b-61f2-44f5-82e3-9d476a81ccd8" containerName="mariadb-account-create-update"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.339615 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a056c11-d814-41ff-a7b7-a1aaa36ed053" containerName="mariadb-database-create"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.340220 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.342959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.351447 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9wjqf"]
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.358636 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mcps2"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.399708 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ztkm4" podUID="7232b466-ffe3-4eab-ad4c-bb2ccac65929" containerName="ovn-controller" probeResult="failure" output=<
Feb 19 18:51:01 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 19 18:51:01 crc kubenswrapper[4749]: >
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.486588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-config-data\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.486659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrds\" (UniqueName: \"kubernetes.io/projected/4fb0bea8-d0bd-4a01-a752-7c9697971db8-kube-api-access-qjrds\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.486812 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-combined-ca-bundle\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.487144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-db-sync-config-data\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.588794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-db-sync-config-data\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.588869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-config-data\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.588919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrds\" (UniqueName: \"kubernetes.io/projected/4fb0bea8-d0bd-4a01-a752-7c9697971db8-kube-api-access-qjrds\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.588952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-combined-ca-bundle\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.595813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-db-sync-config-data\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.596238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-combined-ca-bundle\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.599737 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-config-data\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.607489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrds\" (UniqueName: \"kubernetes.io/projected/4fb0bea8-d0bd-4a01-a752-7c9697971db8-kube-api-access-qjrds\") pod \"glance-db-sync-9wjqf\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") " pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.659139 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.659381 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b56xl"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.798568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b3fa88-5159-45a2-a11e-13b43da32f20-operator-scripts\") pod \"71b3fa88-5159-45a2-a11e-13b43da32f20\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") "
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.799164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v867\" (UniqueName: \"kubernetes.io/projected/71b3fa88-5159-45a2-a11e-13b43da32f20-kube-api-access-8v867\") pod \"71b3fa88-5159-45a2-a11e-13b43da32f20\" (UID: \"71b3fa88-5159-45a2-a11e-13b43da32f20\") "
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.799613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b3fa88-5159-45a2-a11e-13b43da32f20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71b3fa88-5159-45a2-a11e-13b43da32f20" (UID: "71b3fa88-5159-45a2-a11e-13b43da32f20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.803199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b3fa88-5159-45a2-a11e-13b43da32f20-kube-api-access-8v867" (OuterVolumeSpecName: "kube-api-access-8v867") pod "71b3fa88-5159-45a2-a11e-13b43da32f20" (UID: "71b3fa88-5159-45a2-a11e-13b43da32f20"). InnerVolumeSpecName "kube-api-access-8v867". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.901539 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v867\" (UniqueName: \"kubernetes.io/projected/71b3fa88-5159-45a2-a11e-13b43da32f20-kube-api-access-8v867\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.901573 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b3fa88-5159-45a2-a11e-13b43da32f20-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.992918 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7mz2s"]
Feb 19 18:51:01 crc kubenswrapper[4749]: E0219 18:51:01.993394 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b3fa88-5159-45a2-a11e-13b43da32f20" containerName="mariadb-account-create-update"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.993420 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b3fa88-5159-45a2-a11e-13b43da32f20" containerName="mariadb-account-create-update"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.993663 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b3fa88-5159-45a2-a11e-13b43da32f20" containerName="mariadb-account-create-update"
Feb 19 18:51:01 crc kubenswrapper[4749]: I0219 18:51:01.994367 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.005148 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7mz2s"]
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.104859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2s6\" (UniqueName: \"kubernetes.io/projected/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-kube-api-access-hc2s6\") pod \"keystone-db-create-7mz2s\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.105513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-operator-scripts\") pod \"keystone-db-create-7mz2s\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.121840 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-794a-account-create-update-296dd"]
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.122971 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.126700 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.136043 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-794a-account-create-update-296dd"]
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.206784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-operator-scripts\") pod \"keystone-db-create-7mz2s\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.206836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a5f43c-0d61-48a3-abb9-86a7bb12af24-operator-scripts\") pod \"keystone-794a-account-create-update-296dd\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.206962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ss8\" (UniqueName: \"kubernetes.io/projected/33a5f43c-0d61-48a3-abb9-86a7bb12af24-kube-api-access-x8ss8\") pod \"keystone-794a-account-create-update-296dd\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.206999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2s6\" (UniqueName: \"kubernetes.io/projected/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-kube-api-access-hc2s6\") pod \"keystone-db-create-7mz2s\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.207919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-operator-scripts\") pod \"keystone-db-create-7mz2s\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.213816 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-tq984"]
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.215259 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.222328 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tq984"]
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.232686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2s6\" (UniqueName: \"kubernetes.io/projected/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-kube-api-access-hc2s6\") pod \"keystone-db-create-7mz2s\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.276938 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9wjqf"]
Feb 19 18:51:02 crc kubenswrapper[4749]: W0219 18:51:02.286574 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb0bea8_d0bd_4a01_a752_7c9697971db8.slice/crio-aa301d9176a8bc4314fc8a194c27f029da5066c9dbdafc5e774ea7671fed62f2 WatchSource:0}: Error finding container aa301d9176a8bc4314fc8a194c27f029da5066c9dbdafc5e774ea7671fed62f2: Status 404 returned error can't find the container with id aa301d9176a8bc4314fc8a194c27f029da5066c9dbdafc5e774ea7671fed62f2
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.300385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b56xl" event={"ID":"71b3fa88-5159-45a2-a11e-13b43da32f20","Type":"ContainerDied","Data":"f03ad28a187846c37233a43db50831c0b92300f66445f64abab270d34d5573ee"}
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.300440 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f03ad28a187846c37233a43db50831c0b92300f66445f64abab270d34d5573ee"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.300474 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b56xl"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.308212 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a5f43c-0d61-48a3-abb9-86a7bb12af24-operator-scripts\") pod \"keystone-794a-account-create-update-296dd\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.308329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a06e2a80-6319-460a-895c-686ceee7e8df-operator-scripts\") pod \"placement-db-create-tq984\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.308415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfn8d\" (UniqueName: \"kubernetes.io/projected/a06e2a80-6319-460a-895c-686ceee7e8df-kube-api-access-hfn8d\") pod \"placement-db-create-tq984\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.308458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ss8\" (UniqueName: \"kubernetes.io/projected/33a5f43c-0d61-48a3-abb9-86a7bb12af24-kube-api-access-x8ss8\") pod \"keystone-794a-account-create-update-296dd\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.309629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a5f43c-0d61-48a3-abb9-86a7bb12af24-operator-scripts\") pod \"keystone-794a-account-create-update-296dd\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.315449 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7mz2s"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.327726 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-18b6-account-create-update-6cfqr"]
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.329016 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.340830 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.347237 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ss8\" (UniqueName: \"kubernetes.io/projected/33a5f43c-0d61-48a3-abb9-86a7bb12af24-kube-api-access-x8ss8\") pod \"keystone-794a-account-create-update-296dd\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.379107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-18b6-account-create-update-6cfqr"]
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.414722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a06e2a80-6319-460a-895c-686ceee7e8df-operator-scripts\") pod \"placement-db-create-tq984\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.414837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfn8d\" (UniqueName: \"kubernetes.io/projected/a06e2a80-6319-460a-895c-686ceee7e8df-kube-api-access-hfn8d\") pod \"placement-db-create-tq984\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.414972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/307d3d46-e4e9-4ed1-abc3-88afd38d5497-operator-scripts\") pod \"placement-18b6-account-create-update-6cfqr\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.414997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rht6\" (UniqueName: \"kubernetes.io/projected/307d3d46-e4e9-4ed1-abc3-88afd38d5497-kube-api-access-5rht6\") pod \"placement-18b6-account-create-update-6cfqr\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.415714 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a06e2a80-6319-460a-895c-686ceee7e8df-operator-scripts\") pod \"placement-db-create-tq984\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.437681 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-794a-account-create-update-296dd"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.446710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfn8d\" (UniqueName: \"kubernetes.io/projected/a06e2a80-6319-460a-895c-686ceee7e8df-kube-api-access-hfn8d\") pod \"placement-db-create-tq984\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.516863 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/307d3d46-e4e9-4ed1-abc3-88afd38d5497-operator-scripts\") pod \"placement-18b6-account-create-update-6cfqr\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.517223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rht6\" (UniqueName: \"kubernetes.io/projected/307d3d46-e4e9-4ed1-abc3-88afd38d5497-kube-api-access-5rht6\") pod \"placement-18b6-account-create-update-6cfqr\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.517778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/307d3d46-e4e9-4ed1-abc3-88afd38d5497-operator-scripts\") pod \"placement-18b6-account-create-update-6cfqr\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.532435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tq984"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.536004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rht6\" (UniqueName: \"kubernetes.io/projected/307d3d46-e4e9-4ed1-abc3-88afd38d5497-kube-api-access-5rht6\") pod \"placement-18b6-account-create-update-6cfqr\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.662547 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-18b6-account-create-update-6cfqr"
Feb 19 18:51:02 crc kubenswrapper[4749]: I0219 18:51:02.794461 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7mz2s"]
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.000227 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-18b6-account-create-update-6cfqr"]
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.006639 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-794a-account-create-update-296dd"]
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.097406 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tq984"]
Feb 19 18:51:03 crc kubenswrapper[4749]: W0219 18:51:03.109945 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda06e2a80_6319_460a_895c_686ceee7e8df.slice/crio-271f7637d5f1d578476a734ffbd9cdc8ebfac28135a5e3d010929bbbce349177 WatchSource:0}: Error finding container 271f7637d5f1d578476a734ffbd9cdc8ebfac28135a5e3d010929bbbce349177: Status 404 returned error can't find the container with id 271f7637d5f1d578476a734ffbd9cdc8ebfac28135a5e3d010929bbbce349177
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.310216 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-794a-account-create-update-296dd" event={"ID":"33a5f43c-0d61-48a3-abb9-86a7bb12af24","Type":"ContainerStarted","Data":"99ea3770e65720cdb19e7356929d565c47ef4c6b7954073cb2d57c0fee98f1db"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.310260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-794a-account-create-update-296dd" event={"ID":"33a5f43c-0d61-48a3-abb9-86a7bb12af24","Type":"ContainerStarted","Data":"69ab4c778d09dcbc511d4590232a4b86e2c27669c7fa9f5c2401ba7f561d4389"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.312206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7mz2s" event={"ID":"dc1e4d4d-f564-46cd-86ae-2b2902b5f678","Type":"ContainerStarted","Data":"e0e6efb47dd1e6d9f27f0466b50e790b54f4447f59a7c7bdbbafb80b8c1d99b2"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.312232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7mz2s" event={"ID":"dc1e4d4d-f564-46cd-86ae-2b2902b5f678","Type":"ContainerStarted","Data":"d7949fd992c4fc8f365c0d979d15a02cfdc33090b9c1afe670f8d8cb2ebd7b15"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.314406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tq984" event={"ID":"a06e2a80-6319-460a-895c-686ceee7e8df","Type":"ContainerStarted","Data":"2d3d2170553317949e650860ca2acf7aabe778227f7f709ef09631ece2474c0d"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.314448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tq984" event={"ID":"a06e2a80-6319-460a-895c-686ceee7e8df","Type":"ContainerStarted","Data":"271f7637d5f1d578476a734ffbd9cdc8ebfac28135a5e3d010929bbbce349177"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.315710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wjqf" event={"ID":"4fb0bea8-d0bd-4a01-a752-7c9697971db8","Type":"ContainerStarted","Data":"aa301d9176a8bc4314fc8a194c27f029da5066c9dbdafc5e774ea7671fed62f2"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.317478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-18b6-account-create-update-6cfqr" event={"ID":"307d3d46-e4e9-4ed1-abc3-88afd38d5497","Type":"ContainerStarted","Data":"f937af1f2d8b9f78426e90c648d467a608bbc3596d4ff03b7baa83a22085c0d8"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.317521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-18b6-account-create-update-6cfqr" event={"ID":"307d3d46-e4e9-4ed1-abc3-88afd38d5497","Type":"ContainerStarted","Data":"a7f918743f03ed047e520f1d7a8b14f5a25d187d724281a055d3ffafc74e8ad7"}
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.329288 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-ftrcj"]
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.330515 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.351241 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-ftrcj"]
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.367106 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-794a-account-create-update-296dd" podStartSLOduration=1.367087878 podStartE2EDuration="1.367087878s" podCreationTimestamp="2026-02-19 18:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:03.355793033 +0000 UTC m=+1037.317012977" watchObservedRunningTime="2026-02-19 18:51:03.367087878 +0000 UTC m=+1037.328307832"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.426347 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-18b6-account-create-update-6cfqr" podStartSLOduration=1.4263275690000001 podStartE2EDuration="1.426327569s" podCreationTimestamp="2026-02-19 18:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:03.424984026 +0000 UTC m=+1037.386203980" watchObservedRunningTime="2026-02-19 18:51:03.426327569 +0000 UTC m=+1037.387547523"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.429445 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-tq984" podStartSLOduration=1.429438444 podStartE2EDuration="1.429438444s" podCreationTimestamp="2026-02-19 18:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:03.403222166 +0000 UTC m=+1037.364442130" watchObservedRunningTime="2026-02-19 18:51:03.429438444 +0000 UTC m=+1037.390658398"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.436381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aac53868-ce9a-4993-b620-a5cc50286b92-operator-scripts\") pod \"watcher-db-create-ftrcj\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.436442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr2nc\" (UniqueName: \"kubernetes.io/projected/aac53868-ce9a-4993-b620-a5cc50286b92-kube-api-access-sr2nc\") pod \"watcher-db-create-ftrcj\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.444132 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-7mz2s" podStartSLOduration=2.444117142 podStartE2EDuration="2.444117142s" podCreationTimestamp="2026-02-19 18:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:03.441711893 +0000 UTC m=+1037.402931867" watchObservedRunningTime="2026-02-19 18:51:03.444117142 +0000 UTC m=+1037.405337096"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.457369 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-53ab-account-create-update-ggjn8"]
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.458386 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.460484 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.474709 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-53ab-account-create-update-ggjn8"]
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.537957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aac53868-ce9a-4993-b620-a5cc50286b92-operator-scripts\") pod \"watcher-db-create-ftrcj\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.538019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr2nc\" (UniqueName: \"kubernetes.io/projected/aac53868-ce9a-4993-b620-a5cc50286b92-kube-api-access-sr2nc\") pod \"watcher-db-create-ftrcj\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.539121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aac53868-ce9a-4993-b620-a5cc50286b92-operator-scripts\") pod \"watcher-db-create-ftrcj\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.556614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr2nc\" (UniqueName: \"kubernetes.io/projected/aac53868-ce9a-4993-b620-a5cc50286b92-kube-api-access-sr2nc\") pod \"watcher-db-create-ftrcj\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.639742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6kd\" (UniqueName: \"kubernetes.io/projected/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-kube-api-access-7c6kd\") pod \"watcher-53ab-account-create-update-ggjn8\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.639811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-operator-scripts\") pod \"watcher-53ab-account-create-update-ggjn8\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.650282 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-ftrcj"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.741839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6kd\" (UniqueName: \"kubernetes.io/projected/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-kube-api-access-7c6kd\") pod \"watcher-53ab-account-create-update-ggjn8\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.741894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-operator-scripts\") pod \"watcher-53ab-account-create-update-ggjn8\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.742622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-operator-scripts\") pod \"watcher-53ab-account-create-update-ggjn8\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:03 crc kubenswrapper[4749]: I0219 18:51:03.793937 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6kd\" (UniqueName: \"kubernetes.io/projected/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-kube-api-access-7c6kd\") pod \"watcher-53ab-account-create-update-ggjn8\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.074054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-53ab-account-create-update-ggjn8"
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.223666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-ftrcj"]
Feb 19 18:51:04 crc kubenswrapper[4749]: W0219 18:51:04.241246 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaac53868_ce9a_4993_b620_a5cc50286b92.slice/crio-063c6c648c78b9f8b206f7c744d7a50be66e2487302999eb3d45c4ca678b1c87 WatchSource:0}: Error finding container 063c6c648c78b9f8b206f7c744d7a50be66e2487302999eb3d45c4ca678b1c87: Status 404 returned error can't find the container with id 063c6c648c78b9f8b206f7c744d7a50be66e2487302999eb3d45c4ca678b1c87
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.331019 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc1e4d4d-f564-46cd-86ae-2b2902b5f678" containerID="e0e6efb47dd1e6d9f27f0466b50e790b54f4447f59a7c7bdbbafb80b8c1d99b2" exitCode=0
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.331146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7mz2s" event={"ID":"dc1e4d4d-f564-46cd-86ae-2b2902b5f678","Type":"ContainerDied","Data":"e0e6efb47dd1e6d9f27f0466b50e790b54f4447f59a7c7bdbbafb80b8c1d99b2"}
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.333189 4749 generic.go:334] "Generic (PLEG): container finished" podID="a06e2a80-6319-460a-895c-686ceee7e8df" containerID="2d3d2170553317949e650860ca2acf7aabe778227f7f709ef09631ece2474c0d" exitCode=0
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.333267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tq984" event={"ID":"a06e2a80-6319-460a-895c-686ceee7e8df","Type":"ContainerDied","Data":"2d3d2170553317949e650860ca2acf7aabe778227f7f709ef09631ece2474c0d"}
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.335405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-ftrcj" event={"ID":"aac53868-ce9a-4993-b620-a5cc50286b92","Type":"ContainerStarted","Data":"063c6c648c78b9f8b206f7c744d7a50be66e2487302999eb3d45c4ca678b1c87"}
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.337140 4749 generic.go:334] "Generic (PLEG): container finished" podID="307d3d46-e4e9-4ed1-abc3-88afd38d5497" containerID="f937af1f2d8b9f78426e90c648d467a608bbc3596d4ff03b7baa83a22085c0d8" exitCode=0
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.337178 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-18b6-account-create-update-6cfqr" event={"ID":"307d3d46-e4e9-4ed1-abc3-88afd38d5497","Type":"ContainerDied","Data":"f937af1f2d8b9f78426e90c648d467a608bbc3596d4ff03b7baa83a22085c0d8"}
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.338706 4749 generic.go:334] "Generic (PLEG): container finished" podID="33a5f43c-0d61-48a3-abb9-86a7bb12af24" containerID="99ea3770e65720cdb19e7356929d565c47ef4c6b7954073cb2d57c0fee98f1db" exitCode=0
Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.338731 4749 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openstack/keystone-794a-account-create-update-296dd" event={"ID":"33a5f43c-0d61-48a3-abb9-86a7bb12af24","Type":"ContainerDied","Data":"99ea3770e65720cdb19e7356929d565c47ef4c6b7954073cb2d57c0fee98f1db"} Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.395076 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b56xl"] Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.405077 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b56xl"] Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.506505 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-53ab-account-create-update-ggjn8"] Feb 19 18:51:04 crc kubenswrapper[4749]: W0219 18:51:04.518711 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f9406a_e2cf_4bbe_ac55_ac7eb3e9d5b3.slice/crio-9e4783db0889664189c98e0225219e8dee8f44f18a8f7709a136c3835b63bbc8 WatchSource:0}: Error finding container 9e4783db0889664189c98e0225219e8dee8f44f18a8f7709a136c3835b63bbc8: Status 404 returned error can't find the container with id 9e4783db0889664189c98e0225219e8dee8f44f18a8f7709a136c3835b63bbc8 Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.649541 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.652111 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:04 crc kubenswrapper[4749]: I0219 18:51:04.695309 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b3fa88-5159-45a2-a11e-13b43da32f20" path="/var/lib/kubelet/pods/71b3fa88-5159-45a2-a11e-13b43da32f20/volumes" Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.351111 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3" containerID="feed97219043ded708db9c3cc1303e18a31a3ec504ab91a4f8870a38a8005f11" exitCode=0 Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.351176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-53ab-account-create-update-ggjn8" event={"ID":"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3","Type":"ContainerDied","Data":"feed97219043ded708db9c3cc1303e18a31a3ec504ab91a4f8870a38a8005f11"} Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.351375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-53ab-account-create-update-ggjn8" event={"ID":"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3","Type":"ContainerStarted","Data":"9e4783db0889664189c98e0225219e8dee8f44f18a8f7709a136c3835b63bbc8"} Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.354223 4749 generic.go:334] "Generic (PLEG): container finished" podID="aac53868-ce9a-4993-b620-a5cc50286b92" containerID="b79398cce56b4e0681150010c89d7cba4890cb8511db5f22cfb45d82c0a01d17" exitCode=0 Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.354339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-ftrcj" event={"ID":"aac53868-ce9a-4993-b620-a5cc50286b92","Type":"ContainerDied","Data":"b79398cce56b4e0681150010c89d7cba4890cb8511db5f22cfb45d82c0a01d17"} Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.355605 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.825552 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7mz2s" Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.945942 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-794a-account-create-update-296dd" Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.955351 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-18b6-account-create-update-6cfqr" Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.973404 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tq984" Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.979593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc2s6\" (UniqueName: \"kubernetes.io/projected/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-kube-api-access-hc2s6\") pod \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.979775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-operator-scripts\") pod \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\" (UID: \"dc1e4d4d-f564-46cd-86ae-2b2902b5f678\") " Feb 19 18:51:05 crc kubenswrapper[4749]: I0219 18:51:05.980653 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc1e4d4d-f564-46cd-86ae-2b2902b5f678" (UID: "dc1e4d4d-f564-46cd-86ae-2b2902b5f678"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.006788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-kube-api-access-hc2s6" (OuterVolumeSpecName: "kube-api-access-hc2s6") pod "dc1e4d4d-f564-46cd-86ae-2b2902b5f678" (UID: "dc1e4d4d-f564-46cd-86ae-2b2902b5f678"). InnerVolumeSpecName "kube-api-access-hc2s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.081612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfn8d\" (UniqueName: \"kubernetes.io/projected/a06e2a80-6319-460a-895c-686ceee7e8df-kube-api-access-hfn8d\") pod \"a06e2a80-6319-460a-895c-686ceee7e8df\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.081699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a06e2a80-6319-460a-895c-686ceee7e8df-operator-scripts\") pod \"a06e2a80-6319-460a-895c-686ceee7e8df\" (UID: \"a06e2a80-6319-460a-895c-686ceee7e8df\") " Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.081771 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/307d3d46-e4e9-4ed1-abc3-88afd38d5497-operator-scripts\") pod \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.081798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a5f43c-0d61-48a3-abb9-86a7bb12af24-operator-scripts\") pod \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " Feb 19 18:51:06 crc 
kubenswrapper[4749]: I0219 18:51:06.081827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rht6\" (UniqueName: \"kubernetes.io/projected/307d3d46-e4e9-4ed1-abc3-88afd38d5497-kube-api-access-5rht6\") pod \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\" (UID: \"307d3d46-e4e9-4ed1-abc3-88afd38d5497\") " Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.081858 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8ss8\" (UniqueName: \"kubernetes.io/projected/33a5f43c-0d61-48a3-abb9-86a7bb12af24-kube-api-access-x8ss8\") pod \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\" (UID: \"33a5f43c-0d61-48a3-abb9-86a7bb12af24\") " Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.082257 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc2s6\" (UniqueName: \"kubernetes.io/projected/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-kube-api-access-hc2s6\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.082276 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1e4d4d-f564-46cd-86ae-2b2902b5f678-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.082242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/307d3d46-e4e9-4ed1-abc3-88afd38d5497-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "307d3d46-e4e9-4ed1-abc3-88afd38d5497" (UID: "307d3d46-e4e9-4ed1-abc3-88afd38d5497"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.082427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06e2a80-6319-460a-895c-686ceee7e8df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a06e2a80-6319-460a-895c-686ceee7e8df" (UID: "a06e2a80-6319-460a-895c-686ceee7e8df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.082765 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a5f43c-0d61-48a3-abb9-86a7bb12af24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33a5f43c-0d61-48a3-abb9-86a7bb12af24" (UID: "33a5f43c-0d61-48a3-abb9-86a7bb12af24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.086636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307d3d46-e4e9-4ed1-abc3-88afd38d5497-kube-api-access-5rht6" (OuterVolumeSpecName: "kube-api-access-5rht6") pod "307d3d46-e4e9-4ed1-abc3-88afd38d5497" (UID: "307d3d46-e4e9-4ed1-abc3-88afd38d5497"). InnerVolumeSpecName "kube-api-access-5rht6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.088381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06e2a80-6319-460a-895c-686ceee7e8df-kube-api-access-hfn8d" (OuterVolumeSpecName: "kube-api-access-hfn8d") pod "a06e2a80-6319-460a-895c-686ceee7e8df" (UID: "a06e2a80-6319-460a-895c-686ceee7e8df"). InnerVolumeSpecName "kube-api-access-hfn8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.091399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a5f43c-0d61-48a3-abb9-86a7bb12af24-kube-api-access-x8ss8" (OuterVolumeSpecName: "kube-api-access-x8ss8") pod "33a5f43c-0d61-48a3-abb9-86a7bb12af24" (UID: "33a5f43c-0d61-48a3-abb9-86a7bb12af24"). InnerVolumeSpecName "kube-api-access-x8ss8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.183952 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfn8d\" (UniqueName: \"kubernetes.io/projected/a06e2a80-6319-460a-895c-686ceee7e8df-kube-api-access-hfn8d\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.183988 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a06e2a80-6319-460a-895c-686ceee7e8df-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.183998 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/307d3d46-e4e9-4ed1-abc3-88afd38d5497-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.184012 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a5f43c-0d61-48a3-abb9-86a7bb12af24-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.184041 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rht6\" (UniqueName: \"kubernetes.io/projected/307d3d46-e4e9-4ed1-abc3-88afd38d5497-kube-api-access-5rht6\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.184054 4749 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-x8ss8\" (UniqueName: \"kubernetes.io/projected/33a5f43c-0d61-48a3-abb9-86a7bb12af24-kube-api-access-x8ss8\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.379335 4749 generic.go:334] "Generic (PLEG): container finished" podID="f1234ce5-5e40-4f76-a3b5-8b47853bf147" containerID="7e902fa8be8764cd07b40bd1e4e2bf1cedf7edf7ab3e68df5a989c8e0f2e26c7" exitCode=0 Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.379398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czb4z" event={"ID":"f1234ce5-5e40-4f76-a3b5-8b47853bf147","Type":"ContainerDied","Data":"7e902fa8be8764cd07b40bd1e4e2bf1cedf7edf7ab3e68df5a989c8e0f2e26c7"} Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.381246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7mz2s" event={"ID":"dc1e4d4d-f564-46cd-86ae-2b2902b5f678","Type":"ContainerDied","Data":"d7949fd992c4fc8f365c0d979d15a02cfdc33090b9c1afe670f8d8cb2ebd7b15"} Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.381290 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7949fd992c4fc8f365c0d979d15a02cfdc33090b9c1afe670f8d8cb2ebd7b15" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.381349 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7mz2s" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.384508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tq984" event={"ID":"a06e2a80-6319-460a-895c-686ceee7e8df","Type":"ContainerDied","Data":"271f7637d5f1d578476a734ffbd9cdc8ebfac28135a5e3d010929bbbce349177"} Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.384555 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="271f7637d5f1d578476a734ffbd9cdc8ebfac28135a5e3d010929bbbce349177" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.384634 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tq984" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.391933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-18b6-account-create-update-6cfqr" event={"ID":"307d3d46-e4e9-4ed1-abc3-88afd38d5497","Type":"ContainerDied","Data":"a7f918743f03ed047e520f1d7a8b14f5a25d187d724281a055d3ffafc74e8ad7"} Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.391965 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-18b6-account-create-update-6cfqr" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.391976 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f918743f03ed047e520f1d7a8b14f5a25d187d724281a055d3ffafc74e8ad7" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.398148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-794a-account-create-update-296dd" event={"ID":"33a5f43c-0d61-48a3-abb9-86a7bb12af24","Type":"ContainerDied","Data":"69ab4c778d09dcbc511d4590232a4b86e2c27669c7fa9f5c2401ba7f561d4389"} Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.398194 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69ab4c778d09dcbc511d4590232a4b86e2c27669c7fa9f5c2401ba7f561d4389" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.398252 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-794a-account-create-update-296dd" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.400665 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ztkm4" podUID="7232b466-ffe3-4eab-ad4c-bb2ccac65929" containerName="ovn-controller" probeResult="failure" output=< Feb 19 18:51:06 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 18:51:06 crc kubenswrapper[4749]: > Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.435941 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.464759 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sn6j7" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.711951 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-ztkm4-config-8g7rw"] Feb 19 18:51:06 crc kubenswrapper[4749]: E0219 18:51:06.712772 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06e2a80-6319-460a-895c-686ceee7e8df" containerName="mariadb-database-create" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.712792 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06e2a80-6319-460a-895c-686ceee7e8df" containerName="mariadb-database-create" Feb 19 18:51:06 crc kubenswrapper[4749]: E0219 18:51:06.712813 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a5f43c-0d61-48a3-abb9-86a7bb12af24" containerName="mariadb-account-create-update" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.712820 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a5f43c-0d61-48a3-abb9-86a7bb12af24" containerName="mariadb-account-create-update" Feb 19 18:51:06 crc kubenswrapper[4749]: E0219 18:51:06.712842 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307d3d46-e4e9-4ed1-abc3-88afd38d5497" containerName="mariadb-account-create-update" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.712849 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="307d3d46-e4e9-4ed1-abc3-88afd38d5497" containerName="mariadb-account-create-update" Feb 19 18:51:06 crc kubenswrapper[4749]: E0219 18:51:06.712860 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1e4d4d-f564-46cd-86ae-2b2902b5f678" containerName="mariadb-database-create" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.712867 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1e4d4d-f564-46cd-86ae-2b2902b5f678" containerName="mariadb-database-create" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.713068 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a5f43c-0d61-48a3-abb9-86a7bb12af24" containerName="mariadb-account-create-update" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 
18:51:06.713083 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06e2a80-6319-460a-895c-686ceee7e8df" containerName="mariadb-database-create" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.713099 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1e4d4d-f564-46cd-86ae-2b2902b5f678" containerName="mariadb-database-create" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.713115 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="307d3d46-e4e9-4ed1-abc3-88afd38d5497" containerName="mariadb-account-create-update" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.713687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.716512 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.734515 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ztkm4-config-8g7rw"] Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.799277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-additional-scripts\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.799343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-scripts\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: 
I0219 18:51:06.799768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-log-ovn\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.799954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run-ovn\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.800050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4vth\" (UniqueName: \"kubernetes.io/projected/7becec20-41fe-439c-bed2-0f2bc39d558c-kube-api-access-t4vth\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.800086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run-ovn\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " 
pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4vth\" (UniqueName: \"kubernetes.io/projected/7becec20-41fe-439c-bed2-0f2bc39d558c-kube-api-access-t4vth\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902408 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-additional-scripts\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-scripts\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-log-ovn\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " 
pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run-ovn\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.902727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-log-ovn\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.903544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-additional-scripts\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.904630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-scripts\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc 
kubenswrapper[4749]: I0219 18:51:06.930931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4vth\" (UniqueName: \"kubernetes.io/projected/7becec20-41fe-439c-bed2-0f2bc39d558c-kube-api-access-t4vth\") pod \"ovn-controller-ztkm4-config-8g7rw\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") " pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.969580 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-ftrcj" Feb 19 18:51:06 crc kubenswrapper[4749]: I0219 18:51:06.975267 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-53ab-account-create-update-ggjn8" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.038948 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-8g7rw" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.137437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-operator-scripts\") pod \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.137919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr2nc\" (UniqueName: \"kubernetes.io/projected/aac53868-ce9a-4993-b620-a5cc50286b92-kube-api-access-sr2nc\") pod \"aac53868-ce9a-4993-b620-a5cc50286b92\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.137956 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c6kd\" (UniqueName: \"kubernetes.io/projected/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-kube-api-access-7c6kd\") pod 
\"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\" (UID: \"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.138007 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aac53868-ce9a-4993-b620-a5cc50286b92-operator-scripts\") pod \"aac53868-ce9a-4993-b620-a5cc50286b92\" (UID: \"aac53868-ce9a-4993-b620-a5cc50286b92\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.138967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac53868-ce9a-4993-b620-a5cc50286b92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aac53868-ce9a-4993-b620-a5cc50286b92" (UID: "aac53868-ce9a-4993-b620-a5cc50286b92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.139233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3" (UID: "28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.144629 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac53868-ce9a-4993-b620-a5cc50286b92-kube-api-access-sr2nc" (OuterVolumeSpecName: "kube-api-access-sr2nc") pod "aac53868-ce9a-4993-b620-a5cc50286b92" (UID: "aac53868-ce9a-4993-b620-a5cc50286b92"). InnerVolumeSpecName "kube-api-access-sr2nc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.145098 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-kube-api-access-7c6kd" (OuterVolumeSpecName: "kube-api-access-7c6kd") pod "28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3" (UID: "28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3"). InnerVolumeSpecName "kube-api-access-7c6kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.239432 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr2nc\" (UniqueName: \"kubernetes.io/projected/aac53868-ce9a-4993-b620-a5cc50286b92-kube-api-access-sr2nc\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.239470 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c6kd\" (UniqueName: \"kubernetes.io/projected/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-kube-api-access-7c6kd\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.239483 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aac53868-ce9a-4993-b620-a5cc50286b92-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.239494 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.372870 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/notifications-rabbitmq-server-0" podUID="27dfe8e9-686d-4703-b36d-df6b94491b40" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 
18:51:07.407085 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-53ab-account-create-update-ggjn8" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.407110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-53ab-account-create-update-ggjn8" event={"ID":"28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3","Type":"ContainerDied","Data":"9e4783db0889664189c98e0225219e8dee8f44f18a8f7709a136c3835b63bbc8"} Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.407165 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e4783db0889664189c98e0225219e8dee8f44f18a8f7709a136c3835b63bbc8" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.408609 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-ftrcj" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.408646 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-ftrcj" event={"ID":"aac53868-ce9a-4993-b620-a5cc50286b92","Type":"ContainerDied","Data":"063c6c648c78b9f8b206f7c744d7a50be66e2487302999eb3d45c4ca678b1c87"} Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.408664 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="063c6c648c78b9f8b206f7c744d7a50be66e2487302999eb3d45c4ca678b1c87" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.493873 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ztkm4-config-8g7rw"] Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.805100 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.940335 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.953509 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-scripts\") pod \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.953553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-dispersionconf\") pod \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.953582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-swiftconf\") pod \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.953605 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1234ce5-5e40-4f76-a3b5-8b47853bf147-etc-swift\") pod \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.953789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67qqd\" (UniqueName: 
\"kubernetes.io/projected/f1234ce5-5e40-4f76-a3b5-8b47853bf147-kube-api-access-67qqd\") pod \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.953815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-combined-ca-bundle\") pod \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.953861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-ring-data-devices\") pod \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\" (UID: \"f1234ce5-5e40-4f76-a3b5-8b47853bf147\") " Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.955113 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f1234ce5-5e40-4f76-a3b5-8b47853bf147" (UID: "f1234ce5-5e40-4f76-a3b5-8b47853bf147"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.956019 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1234ce5-5e40-4f76-a3b5-8b47853bf147-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f1234ce5-5e40-4f76-a3b5-8b47853bf147" (UID: "f1234ce5-5e40-4f76-a3b5-8b47853bf147"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.961662 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.983650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-scripts" (OuterVolumeSpecName: "scripts") pod "f1234ce5-5e40-4f76-a3b5-8b47853bf147" (UID: "f1234ce5-5e40-4f76-a3b5-8b47853bf147"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.989138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1234ce5-5e40-4f76-a3b5-8b47853bf147-kube-api-access-67qqd" (OuterVolumeSpecName: "kube-api-access-67qqd") pod "f1234ce5-5e40-4f76-a3b5-8b47853bf147" (UID: "f1234ce5-5e40-4f76-a3b5-8b47853bf147"). InnerVolumeSpecName "kube-api-access-67qqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.990196 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f1234ce5-5e40-4f76-a3b5-8b47853bf147" (UID: "f1234ce5-5e40-4f76-a3b5-8b47853bf147"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:07 crc kubenswrapper[4749]: I0219 18:51:07.996199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f1234ce5-5e40-4f76-a3b5-8b47853bf147" (UID: "f1234ce5-5e40-4f76-a3b5-8b47853bf147"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.003142 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1234ce5-5e40-4f76-a3b5-8b47853bf147" (UID: "f1234ce5-5e40-4f76-a3b5-8b47853bf147"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.056399 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.056432 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1234ce5-5e40-4f76-a3b5-8b47853bf147-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.056443 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.056453 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:08 crc 
kubenswrapper[4749]: I0219 18:51:08.056461 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f1234ce5-5e40-4f76-a3b5-8b47853bf147-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.056472 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67qqd\" (UniqueName: \"kubernetes.io/projected/f1234ce5-5e40-4f76-a3b5-8b47853bf147-kube-api-access-67qqd\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.056482 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1234ce5-5e40-4f76-a3b5-8b47853bf147-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.110206 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.110520 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="prometheus" containerID="cri-o://31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3" gracePeriod=600 Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.110589 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="thanos-sidecar" containerID="cri-o://805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4" gracePeriod=600 Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.110603 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="config-reloader" 
containerID="cri-o://d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c" gracePeriod=600 Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.417148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-8g7rw" event={"ID":"7becec20-41fe-439c-bed2-0f2bc39d558c","Type":"ContainerStarted","Data":"2526f8dc6e001f84c0dc07081bcc7822b10251354f7b568a4fde083fdf119970"} Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.417217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-8g7rw" event={"ID":"7becec20-41fe-439c-bed2-0f2bc39d558c","Type":"ContainerStarted","Data":"768754bac7eff6f12ef8ff062b2322d4cf4c3218988f4af2248af4a0b099e56b"} Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.418591 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-czb4z" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.418585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czb4z" event={"ID":"f1234ce5-5e40-4f76-a3b5-8b47853bf147","Type":"ContainerDied","Data":"c2b7c510e5516736fca67d3867c88a114e489f997d7b70e78518e9c519fbe3b1"} Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.418726 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b7c510e5516736fca67d3867c88a114e489f997d7b70e78518e9c519fbe3b1" Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.423332 4749 generic.go:334] "Generic (PLEG): container finished" podID="21b53583-c33c-47c6-8351-35bd5f08e632" containerID="805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4" exitCode=0 Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.423360 4749 generic.go:334] "Generic (PLEG): container finished" podID="21b53583-c33c-47c6-8351-35bd5f08e632" containerID="31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3" exitCode=0 Feb 19 18:51:08 crc 
kubenswrapper[4749]: I0219 18:51:08.423376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerDied","Data":"805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4"} Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.423398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerDied","Data":"31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3"} Feb 19 18:51:08 crc kubenswrapper[4749]: I0219 18:51:08.441799 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ztkm4-config-8g7rw" podStartSLOduration=2.4417844459999998 podStartE2EDuration="2.441784446s" podCreationTimestamp="2026-02-19 18:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:08.438495346 +0000 UTC m=+1042.399715300" watchObservedRunningTime="2026-02-19 18:51:08.441784446 +0000 UTC m=+1042.403004400" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.093549 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.173445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-0\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.173518 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-thanos-prometheus-http-client-file\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.173627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-tls-assets\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.173697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-config\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.173722 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-1\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 
18:51:09.173772 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhd9v\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-kube-api-access-fhd9v\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.173815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21b53583-c33c-47c6-8351-35bd5f08e632-config-out\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.173838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-web-config\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.174068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.174132 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-2\") pod \"21b53583-c33c-47c6-8351-35bd5f08e632\" (UID: \"21b53583-c33c-47c6-8351-35bd5f08e632\") " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.176313 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.176526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.178588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.182500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-kube-api-access-fhd9v" (OuterVolumeSpecName: "kube-api-access-fhd9v") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "kube-api-access-fhd9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.183220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.184134 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b53583-c33c-47c6-8351-35bd5f08e632-config-out" (OuterVolumeSpecName: "config-out") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.189152 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-config" (OuterVolumeSpecName: "config") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.199524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.213961 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-web-config" (OuterVolumeSpecName: "web-config") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.221525 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "21b53583-c33c-47c6-8351-35bd5f08e632" (UID: "21b53583-c33c-47c6-8351-35bd5f08e632"). InnerVolumeSpecName "pvc-82420923-4549-44bf-81a2-5cca6d09b55a". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277104 4749 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277196 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277246 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277260 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhd9v\" (UniqueName: 
\"kubernetes.io/projected/21b53583-c33c-47c6-8351-35bd5f08e632-kube-api-access-fhd9v\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277699 4749 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21b53583-c33c-47c6-8351-35bd5f08e632-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277708 4749 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277791 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") on node \"crc\" " Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277864 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277936 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/21b53583-c33c-47c6-8351-35bd5f08e632-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.277952 4749 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/21b53583-c33c-47c6-8351-35bd5f08e632-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.300339 4749 csi_attacher.go:630] kubernetes.io/csi: 
attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.300490 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-82420923-4549-44bf-81a2-5cca6d09b55a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a") on node "crc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.379171 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410401 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hspjc"] Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.410793 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="prometheus" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410819 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="prometheus" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.410833 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="thanos-sidecar" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410840 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="thanos-sidecar" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.410857 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="init-config-reloader" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410865 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="init-config-reloader" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.410881 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="config-reloader" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410889 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="config-reloader" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.410907 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1234ce5-5e40-4f76-a3b5-8b47853bf147" containerName="swift-ring-rebalance" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410915 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1234ce5-5e40-4f76-a3b5-8b47853bf147" containerName="swift-ring-rebalance" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.410954 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3" containerName="mariadb-account-create-update" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410961 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3" containerName="mariadb-account-create-update" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.410976 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac53868-ce9a-4993-b620-a5cc50286b92" containerName="mariadb-database-create" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.410984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac53868-ce9a-4993-b620-a5cc50286b92" containerName="mariadb-database-create" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.411179 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="prometheus" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.411200 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aac53868-ce9a-4993-b620-a5cc50286b92" containerName="mariadb-database-create" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.411215 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3" containerName="mariadb-account-create-update" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.411226 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="thanos-sidecar" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.411238 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1234ce5-5e40-4f76-a3b5-8b47853bf147" containerName="swift-ring-rebalance" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.411249 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" containerName="config-reloader" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.411828 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.413862 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.431199 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hspjc"] Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.445243 4749 generic.go:334] "Generic (PLEG): container finished" podID="7becec20-41fe-439c-bed2-0f2bc39d558c" containerID="2526f8dc6e001f84c0dc07081bcc7822b10251354f7b568a4fde083fdf119970" exitCode=0 Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.445607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-8g7rw" event={"ID":"7becec20-41fe-439c-bed2-0f2bc39d558c","Type":"ContainerDied","Data":"2526f8dc6e001f84c0dc07081bcc7822b10251354f7b568a4fde083fdf119970"} Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.457357 4749 generic.go:334] "Generic (PLEG): container finished" podID="21b53583-c33c-47c6-8351-35bd5f08e632" containerID="d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c" exitCode=0 Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.457405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerDied","Data":"d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c"} Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.457429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"21b53583-c33c-47c6-8351-35bd5f08e632","Type":"ContainerDied","Data":"c68e6bead1f9133b15c5d597aecff39cffe2005148663c929951f581574c25e4"} Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.457446 4749 scope.go:117] "RemoveContainer" 
containerID="805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.457577 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.511680 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.520526 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.529661 4749 scope.go:117] "RemoveContainer" containerID="d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.544402 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.547558 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.552483 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.552532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.553560 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.553724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.554891 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.554971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.555291 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-h9bs6" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.555479 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.559955 4749 scope.go:117] "RemoveContainer" containerID="31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.561272 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.585997 4749 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.588167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqjb\" (UniqueName: \"kubernetes.io/projected/910f857c-2f81-4747-8401-0b59bab921a2-kube-api-access-cwqjb\") pod \"root-account-create-update-hspjc\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") " pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.588294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910f857c-2f81-4747-8401-0b59bab921a2-operator-scripts\") pod \"root-account-create-update-hspjc\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") " pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.597005 4749 scope.go:117] "RemoveContainer" containerID="8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.629202 4749 scope.go:117] "RemoveContainer" containerID="805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.642151 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4\": container with ID starting with 805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4 not found: ID does not exist" containerID="805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.642199 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4"} err="failed to 
get container status \"805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4\": rpc error: code = NotFound desc = could not find container \"805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4\": container with ID starting with 805ffae3cb911a697a50fd95139308d89663c03e7c9b94a22c8bde6c38c656a4 not found: ID does not exist" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.642231 4749 scope.go:117] "RemoveContainer" containerID="d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.645409 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c\": container with ID starting with d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c not found: ID does not exist" containerID="d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.645446 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c"} err="failed to get container status \"d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c\": rpc error: code = NotFound desc = could not find container \"d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c\": container with ID starting with d0e5ba129d30866e58603466a4a238ab61f5e23b51d34dab6469ae21da450e2c not found: ID does not exist" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.645469 4749 scope.go:117] "RemoveContainer" containerID="31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.646245 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3\": container with ID starting with 31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3 not found: ID does not exist" containerID="31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.646309 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3"} err="failed to get container status \"31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3\": rpc error: code = NotFound desc = could not find container \"31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3\": container with ID starting with 31b79e96d287fde0e6ca67cdf86a525c83fa917860e1ecaa0e024d228a6b4ca3 not found: ID does not exist" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.646330 4749 scope.go:117] "RemoveContainer" containerID="8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e" Feb 19 18:51:09 crc kubenswrapper[4749]: E0219 18:51:09.647242 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e\": container with ID starting with 8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e not found: ID does not exist" containerID="8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.647270 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e"} err="failed to get container status \"8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e\": rpc error: code = NotFound desc = could not find container \"8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e\": container with ID 
starting with 8a8ffdf9e5d60f04cba0c6394d110f8543f0f87d221878e48343da1dfca4650e not found: ID does not exist" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.689882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.689945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.689972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15a688ac-ce3d-40e9-90d0-b013569164e3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-config\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqjb\" (UniqueName: \"kubernetes.io/projected/910f857c-2f81-4747-8401-0b59bab921a2-kube-api-access-cwqjb\") pod \"root-account-create-update-hspjc\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") " pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwhc\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-kube-api-access-zkwhc\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910f857c-2f81-4747-8401-0b59bab921a2-operator-scripts\") pod \"root-account-create-update-hspjc\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") " pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690218 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690246 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.690430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.691604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910f857c-2f81-4747-8401-0b59bab921a2-operator-scripts\") pod \"root-account-create-update-hspjc\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") " pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.710640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqjb\" (UniqueName: \"kubernetes.io/projected/910f857c-2f81-4747-8401-0b59bab921a2-kube-api-access-cwqjb\") pod \"root-account-create-update-hspjc\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") " pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.729493 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hspjc" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791667 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15a688ac-ce3d-40e9-90d0-b013569164e3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-config\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791736 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwhc\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-kube-api-access-zkwhc\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791787 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791924 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.791940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.792729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: 
I0219 18:51:09.793902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.794042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.807824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.807824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.811509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.813274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.814399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.818385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15a688ac-ce3d-40e9-90d0-b013569164e3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.818895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-config\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.821861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.833701 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.833741 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7aef4d84e7f064b8dddb5f07903a3617545888f3f79f605754eebcaaed810a22/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.854848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwhc\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-kube-api-access-zkwhc\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 18:51:09 crc kubenswrapper[4749]: I0219 18:51:09.929624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 18:51:10 crc kubenswrapper[4749]: I0219 18:51:10.170092 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 19 18:51:10 crc kubenswrapper[4749]: I0219 18:51:10.335739 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hspjc"]
Feb 19 18:51:10 crc kubenswrapper[4749]: W0219 18:51:10.348393 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod910f857c_2f81_4747_8401_0b59bab921a2.slice/crio-c2e9ad07d0437727a551271922f6031842f4270f7ea3210c4c6a3318f3dd0791 WatchSource:0}: Error finding container c2e9ad07d0437727a551271922f6031842f4270f7ea3210c4c6a3318f3dd0791: Status 404 returned error can't find the container with id c2e9ad07d0437727a551271922f6031842f4270f7ea3210c4c6a3318f3dd0791
Feb 19 18:51:10 crc kubenswrapper[4749]: I0219 18:51:10.470868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hspjc" event={"ID":"910f857c-2f81-4747-8401-0b59bab921a2","Type":"ContainerStarted","Data":"c2e9ad07d0437727a551271922f6031842f4270f7ea3210c4c6a3318f3dd0791"}
Feb 19 18:51:10 crc kubenswrapper[4749]: I0219 18:51:10.707394 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b53583-c33c-47c6-8351-35bd5f08e632" path="/var/lib/kubelet/pods/21b53583-c33c-47c6-8351-35bd5f08e632/volumes"
Feb 19 18:51:10 crc kubenswrapper[4749]: I0219 18:51:10.708770 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 18:51:10 crc kubenswrapper[4749]: I0219 18:51:10.917248 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-8g7rw"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.010968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run\") pod \"7becec20-41fe-439c-bed2-0f2bc39d558c\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") "
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.011098 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run" (OuterVolumeSpecName: "var-run") pod "7becec20-41fe-439c-bed2-0f2bc39d558c" (UID: "7becec20-41fe-439c-bed2-0f2bc39d558c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.011575 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-scripts\") pod \"7becec20-41fe-439c-bed2-0f2bc39d558c\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") "
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.011636 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-additional-scripts\") pod \"7becec20-41fe-439c-bed2-0f2bc39d558c\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") "
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.011682 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run-ovn\") pod \"7becec20-41fe-439c-bed2-0f2bc39d558c\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") "
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.011712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4vth\" (UniqueName: \"kubernetes.io/projected/7becec20-41fe-439c-bed2-0f2bc39d558c-kube-api-access-t4vth\") pod \"7becec20-41fe-439c-bed2-0f2bc39d558c\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") "
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.011747 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-log-ovn\") pod \"7becec20-41fe-439c-bed2-0f2bc39d558c\" (UID: \"7becec20-41fe-439c-bed2-0f2bc39d558c\") "
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.011785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7becec20-41fe-439c-bed2-0f2bc39d558c" (UID: "7becec20-41fe-439c-bed2-0f2bc39d558c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.012086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7becec20-41fe-439c-bed2-0f2bc39d558c" (UID: "7becec20-41fe-439c-bed2-0f2bc39d558c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.012399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7becec20-41fe-439c-bed2-0f2bc39d558c" (UID: "7becec20-41fe-439c-bed2-0f2bc39d558c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.012432 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.012461 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.012473 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7becec20-41fe-439c-bed2-0f2bc39d558c-var-run\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.013269 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-scripts" (OuterVolumeSpecName: "scripts") pod "7becec20-41fe-439c-bed2-0f2bc39d558c" (UID: "7becec20-41fe-439c-bed2-0f2bc39d558c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.016005 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7becec20-41fe-439c-bed2-0f2bc39d558c-kube-api-access-t4vth" (OuterVolumeSpecName: "kube-api-access-t4vth") pod "7becec20-41fe-439c-bed2-0f2bc39d558c" (UID: "7becec20-41fe-439c-bed2-0f2bc39d558c"). InnerVolumeSpecName "kube-api-access-t4vth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.113480 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.113708 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7becec20-41fe-439c-bed2-0f2bc39d558c-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.113719 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4vth\" (UniqueName: \"kubernetes.io/projected/7becec20-41fe-439c-bed2-0f2bc39d558c-kube-api-access-t4vth\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:11 crc kubenswrapper[4749]: E0219 18:51:11.233347 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod910f857c_2f81_4747_8401_0b59bab921a2.slice/crio-conmon-8c17a93c00a141afefd774bb8ff010f31d0100825f40064435589d1ad56bcb39.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.396306 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ztkm4"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.479691 4749 generic.go:334] "Generic (PLEG): container finished" podID="910f857c-2f81-4747-8401-0b59bab921a2" containerID="8c17a93c00a141afefd774bb8ff010f31d0100825f40064435589d1ad56bcb39" exitCode=0
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.479750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hspjc" event={"ID":"910f857c-2f81-4747-8401-0b59bab921a2","Type":"ContainerDied","Data":"8c17a93c00a141afefd774bb8ff010f31d0100825f40064435589d1ad56bcb39"}
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.484064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerStarted","Data":"6232beab92b2ee5f38edfd6b54766ce3eefef5f79d59f38952df942c0abd0b94"}
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.486001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-8g7rw" event={"ID":"7becec20-41fe-439c-bed2-0f2bc39d558c","Type":"ContainerDied","Data":"768754bac7eff6f12ef8ff062b2322d4cf4c3218988f4af2248af4a0b099e56b"}
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.486042 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="768754bac7eff6f12ef8ff062b2322d4cf4c3218988f4af2248af4a0b099e56b"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.486105 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-8g7rw"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.556476 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ztkm4-config-8g7rw"]
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.565086 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ztkm4-config-8g7rw"]
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.589073 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ztkm4-config-jmmzz"]
Feb 19 18:51:11 crc kubenswrapper[4749]: E0219 18:51:11.589476 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7becec20-41fe-439c-bed2-0f2bc39d558c" containerName="ovn-config"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.589499 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7becec20-41fe-439c-bed2-0f2bc39d558c" containerName="ovn-config"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.589741 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7becec20-41fe-439c-bed2-0f2bc39d558c" containerName="ovn-config"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.590458 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.592568 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.613186 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ztkm4-config-jmmzz"]
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.621421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run-ovn\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.621504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-scripts\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.621627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-log-ovn\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.621698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzcz\" (UniqueName: \"kubernetes.io/projected/7d88e0b5-c217-4481-bbbd-594240990a40-kube-api-access-nmzcz\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.621802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-additional-scripts\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.621946 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.723284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.723409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run-ovn\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.723451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-scripts\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.723469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-log-ovn\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.723494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzcz\" (UniqueName: \"kubernetes.io/projected/7d88e0b5-c217-4481-bbbd-594240990a40-kube-api-access-nmzcz\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.723519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-additional-scripts\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.724194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run-ovn\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.724277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-log-ovn\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.724310 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-additional-scripts\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.727009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-scripts\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.727231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.753241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzcz\" (UniqueName: \"kubernetes.io/projected/7d88e0b5-c217-4481-bbbd-594240990a40-kube-api-access-nmzcz\") pod \"ovn-controller-ztkm4-config-jmmzz\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:11 crc kubenswrapper[4749]: I0219 18:51:11.909830 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-jmmzz"
Feb 19 18:51:12 crc kubenswrapper[4749]: I0219 18:51:12.687961 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7becec20-41fe-439c-bed2-0f2bc39d558c" path="/var/lib/kubelet/pods/7becec20-41fe-439c-bed2-0f2bc39d558c/volumes"
Feb 19 18:51:14 crc kubenswrapper[4749]: I0219 18:51:14.529443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerStarted","Data":"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"}
Feb 19 18:51:16 crc kubenswrapper[4749]: I0219 18:51:16.706304 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0"
Feb 19 18:51:16 crc kubenswrapper[4749]: I0219 18:51:16.713249 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ece11938-c758-4d62-ad84-c630d040f511-etc-swift\") pod \"swift-storage-0\" (UID: \"ece11938-c758-4d62-ad84-c630d040f511\") " pod="openstack/swift-storage-0"
Feb 19 18:51:16 crc kubenswrapper[4749]: I0219 18:51:16.904542 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 19 18:51:17 crc kubenswrapper[4749]: I0219 18:51:17.372919 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0"
Feb 19 18:51:17 crc kubenswrapper[4749]: I0219 18:51:17.940528 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 18:51:17 crc kubenswrapper[4749]: I0219 18:51:17.960245 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 18:51:18 crc kubenswrapper[4749]: I0219 18:51:18.687288 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hspjc"
Feb 19 18:51:18 crc kubenswrapper[4749]: I0219 18:51:18.847592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwqjb\" (UniqueName: \"kubernetes.io/projected/910f857c-2f81-4747-8401-0b59bab921a2-kube-api-access-cwqjb\") pod \"910f857c-2f81-4747-8401-0b59bab921a2\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") "
Feb 19 18:51:18 crc kubenswrapper[4749]: I0219 18:51:18.848049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910f857c-2f81-4747-8401-0b59bab921a2-operator-scripts\") pod \"910f857c-2f81-4747-8401-0b59bab921a2\" (UID: \"910f857c-2f81-4747-8401-0b59bab921a2\") "
Feb 19 18:51:18 crc kubenswrapper[4749]: I0219 18:51:18.849133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910f857c-2f81-4747-8401-0b59bab921a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "910f857c-2f81-4747-8401-0b59bab921a2" (UID: "910f857c-2f81-4747-8401-0b59bab921a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:18 crc kubenswrapper[4749]: I0219 18:51:18.855428 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910f857c-2f81-4747-8401-0b59bab921a2-kube-api-access-cwqjb" (OuterVolumeSpecName: "kube-api-access-cwqjb") pod "910f857c-2f81-4747-8401-0b59bab921a2" (UID: "910f857c-2f81-4747-8401-0b59bab921a2"). InnerVolumeSpecName "kube-api-access-cwqjb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:18.950055 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwqjb\" (UniqueName: \"kubernetes.io/projected/910f857c-2f81-4747-8401-0b59bab921a2-kube-api-access-cwqjb\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:18.950099 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910f857c-2f81-4747-8401-0b59bab921a2-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.052218 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ztkm4-config-jmmzz"]
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.155710 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.570800 4749 generic.go:334] "Generic (PLEG): container finished" podID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerID="932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8" exitCode=0
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.570986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerDied","Data":"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"}
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.593948 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-jmmzz" event={"ID":"7d88e0b5-c217-4481-bbbd-594240990a40","Type":"ContainerStarted","Data":"ea3111911f70a12e19c82baf183a7d9b4294320ae0c58115da896f67e0ca42fc"}
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.593990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-jmmzz" event={"ID":"7d88e0b5-c217-4481-bbbd-594240990a40","Type":"ContainerStarted","Data":"6d81f7708d5dd021310db9c4e14f2dc86d6c608404977a97f576ffac3a59b944"}
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.600807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wjqf" event={"ID":"4fb0bea8-d0bd-4a01-a752-7c9697971db8","Type":"ContainerStarted","Data":"41eb58b1338db1e882273d3a36b1da142d86f374ea98f3b78a0578f261bd70df"}
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.608686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hspjc" event={"ID":"910f857c-2f81-4747-8401-0b59bab921a2","Type":"ContainerDied","Data":"c2e9ad07d0437727a551271922f6031842f4270f7ea3210c4c6a3318f3dd0791"}
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.608724 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e9ad07d0437727a551271922f6031842f4270f7ea3210c4c6a3318f3dd0791"
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.608776 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hspjc"
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.615022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"2233dc71bc121f122b196d2980f13ad1ac9d9de7226abd9d6e94c18e61d3dea4"}
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.619399 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ztkm4-config-jmmzz" podStartSLOduration=8.61938098 podStartE2EDuration="8.61938098s" podCreationTimestamp="2026-02-19 18:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:19.615303351 +0000 UTC m=+1053.576523295" watchObservedRunningTime="2026-02-19 18:51:19.61938098 +0000 UTC m=+1053.580600934"
Feb 19 18:51:19 crc kubenswrapper[4749]: I0219 18:51:19.638713 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9wjqf" podStartSLOduration=2.126022191 podStartE2EDuration="18.63869579s" podCreationTimestamp="2026-02-19 18:51:01 +0000 UTC" firstStartedPulling="2026-02-19 18:51:02.294315384 +0000 UTC m=+1036.255535338" lastFinishedPulling="2026-02-19 18:51:18.806988983 +0000 UTC m=+1052.768208937" observedRunningTime="2026-02-19 18:51:19.635435341 +0000 UTC m=+1053.596655295" watchObservedRunningTime="2026-02-19 18:51:19.63869579 +0000 UTC m=+1053.599915744"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.392598 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rhcc5"]
Feb 19 18:51:20 crc kubenswrapper[4749]: E0219 18:51:20.395991 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910f857c-2f81-4747-8401-0b59bab921a2" containerName="mariadb-account-create-update"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.396108 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="910f857c-2f81-4747-8401-0b59bab921a2" containerName="mariadb-account-create-update"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.396387 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="910f857c-2f81-4747-8401-0b59bab921a2" containerName="mariadb-account-create-update"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.397189 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.402164 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rhcc5"]
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.488719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlhp8\" (UniqueName: \"kubernetes.io/projected/4fa888c5-51f3-47a3-ae88-65694f44677a-kube-api-access-wlhp8\") pod \"barbican-db-create-rhcc5\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.488848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa888c5-51f3-47a3-ae88-65694f44677a-operator-scripts\") pod \"barbican-db-create-rhcc5\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.514471 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cbfb-account-create-update-gzp2p"]
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.515704 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.524665 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.529171 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cbfb-account-create-update-gzp2p"]
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.590585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa888c5-51f3-47a3-ae88-65694f44677a-operator-scripts\") pod \"barbican-db-create-rhcc5\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.590638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb2n6\" (UniqueName: \"kubernetes.io/projected/fbc2d05b-8f07-4386-9d4c-604e07f0f265-kube-api-access-pb2n6\") pod \"barbican-cbfb-account-create-update-gzp2p\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.590681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlhp8\" (UniqueName: \"kubernetes.io/projected/4fa888c5-51f3-47a3-ae88-65694f44677a-kube-api-access-wlhp8\") pod \"barbican-db-create-rhcc5\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.590739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbc2d05b-8f07-4386-9d4c-604e07f0f265-operator-scripts\") pod \"barbican-cbfb-account-create-update-gzp2p\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.591388 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa888c5-51f3-47a3-ae88-65694f44677a-operator-scripts\") pod \"barbican-db-create-rhcc5\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.641007 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlhp8\" (UniqueName: \"kubernetes.io/projected/4fa888c5-51f3-47a3-ae88-65694f44677a-kube-api-access-wlhp8\") pod \"barbican-db-create-rhcc5\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.644163 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1d78-account-create-update-lrl5l"]
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.646175 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d78-account-create-update-lrl5l"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.656582 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.721008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbc2d05b-8f07-4386-9d4c-604e07f0f265-operator-scripts\") pod \"barbican-cbfb-account-create-update-gzp2p\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.721202 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb2n6\" (UniqueName: \"kubernetes.io/projected/fbc2d05b-8f07-4386-9d4c-604e07f0f265-kube-api-access-pb2n6\") pod \"barbican-cbfb-account-create-update-gzp2p\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.733173 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbc2d05b-8f07-4386-9d4c-604e07f0f265-operator-scripts\") pod \"barbican-cbfb-account-create-update-gzp2p\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.734098 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.750084 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nggkc"]
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.751366 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nggkc"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.766163 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nggkc"]
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.777798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb2n6\" (UniqueName: \"kubernetes.io/projected/fbc2d05b-8f07-4386-9d4c-604e07f0f265-kube-api-access-pb2n6\") pod \"barbican-cbfb-account-create-update-gzp2p\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.793440 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1d78-account-create-update-lrl5l"]
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.803293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"c17e4d3c3b20f88b5b3d218549f34b2f2166e48ae6ff8749817b6837b1e2b532"}
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.803360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"66241b6744cb4f4f30fae865efd6e86a3adb6d68dad79213da8d27dfa522ee33"}
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.820176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerStarted","Data":"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"}
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.839509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1767b8ff-e819-4268-982c-57cd067b1cd5-operator-scripts\") pod \"cinder-1d78-account-create-update-lrl5l\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " pod="openstack/cinder-1d78-account-create-update-lrl5l"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.839762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g5bv\" (UniqueName: \"kubernetes.io/projected/f34e48f9-e454-4f11-b78e-965516098e91-kube-api-access-6g5bv\") pod \"cinder-db-create-nggkc\" (UID: \"f34e48f9-e454-4f11-b78e-965516098e91\") " pod="openstack/cinder-db-create-nggkc"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.850162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34e48f9-e454-4f11-b78e-965516098e91-operator-scripts\") pod \"cinder-db-create-nggkc\" (UID: \"f34e48f9-e454-4f11-b78e-965516098e91\") " pod="openstack/cinder-db-create-nggkc"
Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.848623 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cbfb-account-create-update-gzp2p" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.850390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqc2c\" (UniqueName: \"kubernetes.io/projected/1767b8ff-e819-4268-982c-57cd067b1cd5-kube-api-access-bqc2c\") pod \"cinder-1d78-account-create-update-lrl5l\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " pod="openstack/cinder-1d78-account-create-update-lrl5l" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.868817 4749 generic.go:334] "Generic (PLEG): container finished" podID="7d88e0b5-c217-4481-bbbd-594240990a40" containerID="ea3111911f70a12e19c82baf183a7d9b4294320ae0c58115da896f67e0ca42fc" exitCode=0 Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.869865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-jmmzz" event={"ID":"7d88e0b5-c217-4481-bbbd-594240990a40","Type":"ContainerDied","Data":"ea3111911f70a12e19c82baf183a7d9b4294320ae0c58115da896f67e0ca42fc"} Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.955089 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-trrfz"] Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.956569 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.956989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqc2c\" (UniqueName: \"kubernetes.io/projected/1767b8ff-e819-4268-982c-57cd067b1cd5-kube-api-access-bqc2c\") pod \"cinder-1d78-account-create-update-lrl5l\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " pod="openstack/cinder-1d78-account-create-update-lrl5l" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.957073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1767b8ff-e819-4268-982c-57cd067b1cd5-operator-scripts\") pod \"cinder-1d78-account-create-update-lrl5l\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " pod="openstack/cinder-1d78-account-create-update-lrl5l" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.957199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g5bv\" (UniqueName: \"kubernetes.io/projected/f34e48f9-e454-4f11-b78e-965516098e91-kube-api-access-6g5bv\") pod \"cinder-db-create-nggkc\" (UID: \"f34e48f9-e454-4f11-b78e-965516098e91\") " pod="openstack/cinder-db-create-nggkc" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.957224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34e48f9-e454-4f11-b78e-965516098e91-operator-scripts\") pod \"cinder-db-create-nggkc\" (UID: \"f34e48f9-e454-4f11-b78e-965516098e91\") " pod="openstack/cinder-db-create-nggkc" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.957856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34e48f9-e454-4f11-b78e-965516098e91-operator-scripts\") pod \"cinder-db-create-nggkc\" (UID: 
\"f34e48f9-e454-4f11-b78e-965516098e91\") " pod="openstack/cinder-db-create-nggkc" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.958951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1767b8ff-e819-4268-982c-57cd067b1cd5-operator-scripts\") pod \"cinder-1d78-account-create-update-lrl5l\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " pod="openstack/cinder-1d78-account-create-update-lrl5l" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.969620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.969910 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xqgf2" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.970115 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.970248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-trrfz"] Feb 19 18:51:20 crc kubenswrapper[4749]: I0219 18:51:20.970314 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.005245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqc2c\" (UniqueName: \"kubernetes.io/projected/1767b8ff-e819-4268-982c-57cd067b1cd5-kube-api-access-bqc2c\") pod \"cinder-1d78-account-create-update-lrl5l\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " pod="openstack/cinder-1d78-account-create-update-lrl5l" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.021318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g5bv\" (UniqueName: \"kubernetes.io/projected/f34e48f9-e454-4f11-b78e-965516098e91-kube-api-access-6g5bv\") pod 
\"cinder-db-create-nggkc\" (UID: \"f34e48f9-e454-4f11-b78e-965516098e91\") " pod="openstack/cinder-db-create-nggkc" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.035104 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2xmjl"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.036561 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.060777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-combined-ca-bundle\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.061438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f880df-7af3-4bab-91f1-5085b70a86d0-operator-scripts\") pod \"neutron-db-create-2xmjl\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.061524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-config-data\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.061605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttflw\" (UniqueName: \"kubernetes.io/projected/284da664-b432-48cf-8f30-fa7cc57bd5b3-kube-api-access-ttflw\") pod \"keystone-db-sync-trrfz\" (UID: 
\"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.061795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld7m5\" (UniqueName: \"kubernetes.io/projected/15f880df-7af3-4bab-91f1-5085b70a86d0-kube-api-access-ld7m5\") pod \"neutron-db-create-2xmjl\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.065122 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-14db-account-create-update-xdwgq"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.068918 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d78-account-create-update-lrl5l" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.076223 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.083763 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.084188 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2xmjl"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.094111 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-14db-account-create-update-xdwgq"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.162873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-operator-scripts\") pod \"neutron-14db-account-create-update-xdwgq\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 
18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.162963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld7m5\" (UniqueName: \"kubernetes.io/projected/15f880df-7af3-4bab-91f1-5085b70a86d0-kube-api-access-ld7m5\") pod \"neutron-db-create-2xmjl\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.163029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-combined-ca-bundle\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.163067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8sf\" (UniqueName: \"kubernetes.io/projected/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-kube-api-access-9v8sf\") pod \"neutron-14db-account-create-update-xdwgq\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.163114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f880df-7af3-4bab-91f1-5085b70a86d0-operator-scripts\") pod \"neutron-db-create-2xmjl\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.163130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-config-data\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc 
kubenswrapper[4749]: I0219 18:51:21.163147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttflw\" (UniqueName: \"kubernetes.io/projected/284da664-b432-48cf-8f30-fa7cc57bd5b3-kube-api-access-ttflw\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.165076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f880df-7af3-4bab-91f1-5085b70a86d0-operator-scripts\") pod \"neutron-db-create-2xmjl\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.170094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-combined-ca-bundle\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.180766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-config-data\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.191156 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttflw\" (UniqueName: \"kubernetes.io/projected/284da664-b432-48cf-8f30-fa7cc57bd5b3-kube-api-access-ttflw\") pod \"keystone-db-sync-trrfz\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.204029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ld7m5\" (UniqueName: \"kubernetes.io/projected/15f880df-7af3-4bab-91f1-5085b70a86d0-kube-api-access-ld7m5\") pod \"neutron-db-create-2xmjl\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.206914 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nggkc" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.264447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-operator-scripts\") pod \"neutron-14db-account-create-update-xdwgq\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.268368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8sf\" (UniqueName: \"kubernetes.io/projected/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-kube-api-access-9v8sf\") pod \"neutron-14db-account-create-update-xdwgq\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.271192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-operator-scripts\") pod \"neutron-14db-account-create-update-xdwgq\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.287066 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.310609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8sf\" (UniqueName: \"kubernetes.io/projected/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-kube-api-access-9v8sf\") pod \"neutron-14db-account-create-update-xdwgq\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.319104 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-tkp7q"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.320215 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.324158 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-p6p4n" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.324343 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.359808 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-tkp7q"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.373016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-combined-ca-bundle\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.373185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldgjl\" (UniqueName: \"kubernetes.io/projected/04522760-872e-4efc-a852-726292ac24f4-kube-api-access-ldgjl\") 
pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.373243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-config-data\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.373425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-db-sync-config-data\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.421789 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.441268 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.474894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldgjl\" (UniqueName: \"kubernetes.io/projected/04522760-872e-4efc-a852-726292ac24f4-kube-api-access-ldgjl\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.474952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-config-data\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.474979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-db-sync-config-data\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.475022 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-combined-ca-bundle\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.488366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-db-sync-config-data\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc 
kubenswrapper[4749]: I0219 18:51:21.489807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-config-data\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.490555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-combined-ca-bundle\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.512555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldgjl\" (UniqueName: \"kubernetes.io/projected/04522760-872e-4efc-a852-726292ac24f4-kube-api-access-ldgjl\") pod \"watcher-db-sync-tkp7q\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.515479 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rhcc5"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.645426 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.670784 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cbfb-account-create-update-gzp2p"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.870175 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1d78-account-create-update-lrl5l"] Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.934336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhcc5" event={"ID":"4fa888c5-51f3-47a3-ae88-65694f44677a","Type":"ContainerStarted","Data":"1ab18b1b301233d4495d4c51b4d490762920610c2f53414c0ecfbb0bef169bac"} Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.941869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cbfb-account-create-update-gzp2p" event={"ID":"fbc2d05b-8f07-4386-9d4c-604e07f0f265","Type":"ContainerStarted","Data":"88e1a3d68fc3a2943b7f28b5dc67154b573a354b76747fdedf155f010f4e726d"} Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.970188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"410a08b8affd04b0fd469b83afe90f61e6c314603b427a49b53d3a9a15938554"} Feb 19 18:51:21 crc kubenswrapper[4749]: I0219 18:51:21.970227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"f7d3ca71fbe790ec494f12a30e3a4e56fcbcd944343a197c70c3aee66534eb01"} Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.070010 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nggkc"] Feb 19 18:51:22 crc kubenswrapper[4749]: W0219 18:51:22.078982 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34e48f9_e454_4f11_b78e_965516098e91.slice/crio-df1589f0a03c323489f6ce9978a10b1d6427c496cc9481dbdb11ae4efdae78d6 WatchSource:0}: Error finding container df1589f0a03c323489f6ce9978a10b1d6427c496cc9481dbdb11ae4efdae78d6: Status 404 returned error can't find the container with id df1589f0a03c323489f6ce9978a10b1d6427c496cc9481dbdb11ae4efdae78d6 Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.092014 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-trrfz"] Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.190961 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2xmjl"] Feb 19 18:51:22 crc kubenswrapper[4749]: W0219 18:51:22.206343 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15f880df_7af3_4bab_91f1_5085b70a86d0.slice/crio-e287229ca80deb76c1ffe3ec7f166df59c2d98c0269212aae4395d1e978f1194 WatchSource:0}: Error finding container e287229ca80deb76c1ffe3ec7f166df59c2d98c0269212aae4395d1e978f1194: Status 404 returned error can't find the container with id e287229ca80deb76c1ffe3ec7f166df59c2d98c0269212aae4395d1e978f1194 Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.323746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-14db-account-create-update-xdwgq"] Feb 19 18:51:22 crc kubenswrapper[4749]: W0219 18:51:22.335887 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55d0d068_45a8_49a7_9cbe_a3ff70cc056f.slice/crio-756c142bd8e899b5b55fe58868a3b559ccb5fa11980904e221561693f0c74111 WatchSource:0}: Error finding container 756c142bd8e899b5b55fe58868a3b559ccb5fa11980904e221561693f0c74111: Status 404 returned error can't find the container with id 756c142bd8e899b5b55fe58868a3b559ccb5fa11980904e221561693f0c74111 Feb 19 
18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.336890 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-tkp7q"] Feb 19 18:51:22 crc kubenswrapper[4749]: W0219 18:51:22.343542 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04522760_872e_4efc_a852_726292ac24f4.slice/crio-3c418d6c071ce094e0b5a939a385d91ddaac3572e2cfd636845541b4460e9446 WatchSource:0}: Error finding container 3c418d6c071ce094e0b5a939a385d91ddaac3572e2cfd636845541b4460e9446: Status 404 returned error can't find the container with id 3c418d6c071ce094e0b5a939a385d91ddaac3572e2cfd636845541b4460e9446 Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.423952 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-jmmzz" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.508477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-additional-scripts\") pod \"7d88e0b5-c217-4481-bbbd-594240990a40\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.508573 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run\") pod \"7d88e0b5-c217-4481-bbbd-594240990a40\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.508678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-log-ovn\") pod \"7d88e0b5-c217-4481-bbbd-594240990a40\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.508712 
4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run-ovn\") pod \"7d88e0b5-c217-4481-bbbd-594240990a40\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.508739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run" (OuterVolumeSpecName: "var-run") pod "7d88e0b5-c217-4481-bbbd-594240990a40" (UID: "7d88e0b5-c217-4481-bbbd-594240990a40"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.508758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmzcz\" (UniqueName: \"kubernetes.io/projected/7d88e0b5-c217-4481-bbbd-594240990a40-kube-api-access-nmzcz\") pod \"7d88e0b5-c217-4481-bbbd-594240990a40\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.508863 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-scripts\") pod \"7d88e0b5-c217-4481-bbbd-594240990a40\" (UID: \"7d88e0b5-c217-4481-bbbd-594240990a40\") " Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509047 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7d88e0b5-c217-4481-bbbd-594240990a40" (UID: "7d88e0b5-c217-4481-bbbd-594240990a40"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509148 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7d88e0b5-c217-4481-bbbd-594240990a40" (UID: "7d88e0b5-c217-4481-bbbd-594240990a40"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7d88e0b5-c217-4481-bbbd-594240990a40" (UID: "7d88e0b5-c217-4481-bbbd-594240990a40"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509643 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509668 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509679 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509689 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d88e0b5-c217-4481-bbbd-594240990a40-var-run-ovn\") on node \"crc\" DevicePath \"\"" 
Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.509824 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-scripts" (OuterVolumeSpecName: "scripts") pod "7d88e0b5-c217-4481-bbbd-594240990a40" (UID: "7d88e0b5-c217-4481-bbbd-594240990a40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.514246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d88e0b5-c217-4481-bbbd-594240990a40-kube-api-access-nmzcz" (OuterVolumeSpecName: "kube-api-access-nmzcz") pod "7d88e0b5-c217-4481-bbbd-594240990a40" (UID: "7d88e0b5-c217-4481-bbbd-594240990a40"). InnerVolumeSpecName "kube-api-access-nmzcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.610767 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmzcz\" (UniqueName: \"kubernetes.io/projected/7d88e0b5-c217-4481-bbbd-594240990a40-kube-api-access-nmzcz\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.610807 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d88e0b5-c217-4481-bbbd-594240990a40-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.987261 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ztkm4-config-jmmzz" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.987267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ztkm4-config-jmmzz" event={"ID":"7d88e0b5-c217-4481-bbbd-594240990a40","Type":"ContainerDied","Data":"6d81f7708d5dd021310db9c4e14f2dc86d6c608404977a97f576ffac3a59b944"} Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.987805 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d81f7708d5dd021310db9c4e14f2dc86d6c608404977a97f576ffac3a59b944" Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.988123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-trrfz" event={"ID":"284da664-b432-48cf-8f30-fa7cc57bd5b3","Type":"ContainerStarted","Data":"2dcfa948891457b1ab4dfda1923092ce8ec0432bc78f656baab626fcf2787335"} Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.990167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1d78-account-create-update-lrl5l" event={"ID":"1767b8ff-e819-4268-982c-57cd067b1cd5","Type":"ContainerStarted","Data":"ad0e7d8564db6dacbe89319f85830b5e2c7bc2b01cbebdcb6e3ab4a6d370319f"} Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.990200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1d78-account-create-update-lrl5l" event={"ID":"1767b8ff-e819-4268-982c-57cd067b1cd5","Type":"ContainerStarted","Data":"475851cbe77d7d61e6b33d43f7936ff5ac5e0a0e2d46e254f9d83e1314919902"} Feb 19 18:51:22 crc kubenswrapper[4749]: I0219 18:51:22.998456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerStarted","Data":"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.000045 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-14db-account-create-update-xdwgq" event={"ID":"55d0d068-45a8-49a7-9cbe-a3ff70cc056f","Type":"ContainerStarted","Data":"4f7270a050bf0b55c766d387494cbb82874997b5f40c884d357181fdc9361b53"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.000115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-14db-account-create-update-xdwgq" event={"ID":"55d0d068-45a8-49a7-9cbe-a3ff70cc056f","Type":"ContainerStarted","Data":"756c142bd8e899b5b55fe58868a3b559ccb5fa11980904e221561693f0c74111"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.009560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhcc5" event={"ID":"4fa888c5-51f3-47a3-ae88-65694f44677a","Type":"ContainerStarted","Data":"51054023a2dc977d46e8925ec806bae10007d1e004f36abc9d03bf6988217c7a"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.012948 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-1d78-account-create-update-lrl5l" podStartSLOduration=3.012929392 podStartE2EDuration="3.012929392s" podCreationTimestamp="2026-02-19 18:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:23.006769713 +0000 UTC m=+1056.967989687" watchObservedRunningTime="2026-02-19 18:51:23.012929392 +0000 UTC m=+1056.974149346" Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.015183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nggkc" event={"ID":"f34e48f9-e454-4f11-b78e-965516098e91","Type":"ContainerStarted","Data":"cd30e62108093956c6066b5e2d48fc375104fa0c280c57d78b26afdd27a388b2"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.015222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nggkc" 
event={"ID":"f34e48f9-e454-4f11-b78e-965516098e91","Type":"ContainerStarted","Data":"df1589f0a03c323489f6ce9978a10b1d6427c496cc9481dbdb11ae4efdae78d6"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.018979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cbfb-account-create-update-gzp2p" event={"ID":"fbc2d05b-8f07-4386-9d4c-604e07f0f265","Type":"ContainerStarted","Data":"3554e0aa16a11cbb9c6f94a23b8e93ea5fc408c9dbeaae7a66d9a5d403192f0e"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.025399 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-14db-account-create-update-xdwgq" podStartSLOduration=3.025375695 podStartE2EDuration="3.025375695s" podCreationTimestamp="2026-02-19 18:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:23.020492536 +0000 UTC m=+1056.981712490" watchObservedRunningTime="2026-02-19 18:51:23.025375695 +0000 UTC m=+1056.986595659" Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.025559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2xmjl" event={"ID":"15f880df-7af3-4bab-91f1-5085b70a86d0","Type":"ContainerStarted","Data":"c6616dd3022501a5ad9ae3b9f6dd0bcb3db2fb08710e3f51164bce28d0d78b30"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.025620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2xmjl" event={"ID":"15f880df-7af3-4bab-91f1-5085b70a86d0","Type":"ContainerStarted","Data":"e287229ca80deb76c1ffe3ec7f166df59c2d98c0269212aae4395d1e978f1194"} Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.027897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-tkp7q" event={"ID":"04522760-872e-4efc-a852-726292ac24f4","Type":"ContainerStarted","Data":"3c418d6c071ce094e0b5a939a385d91ddaac3572e2cfd636845541b4460e9446"} Feb 19 
18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.047721 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-cbfb-account-create-update-gzp2p" podStartSLOduration=3.047705519 podStartE2EDuration="3.047705519s" podCreationTimestamp="2026-02-19 18:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:23.041509407 +0000 UTC m=+1057.002729371" watchObservedRunningTime="2026-02-19 18:51:23.047705519 +0000 UTC m=+1057.008925473" Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.056198 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-rhcc5" podStartSLOduration=3.056179805 podStartE2EDuration="3.056179805s" podCreationTimestamp="2026-02-19 18:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:23.0531164 +0000 UTC m=+1057.014336374" watchObservedRunningTime="2026-02-19 18:51:23.056179805 +0000 UTC m=+1057.017399769" Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.073851 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-nggkc" podStartSLOduration=3.073834104 podStartE2EDuration="3.073834104s" podCreationTimestamp="2026-02-19 18:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:23.069202961 +0000 UTC m=+1057.030422925" watchObservedRunningTime="2026-02-19 18:51:23.073834104 +0000 UTC m=+1057.035054058" Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.082400 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-2xmjl" podStartSLOduration=3.082380953 podStartE2EDuration="3.082380953s" podCreationTimestamp="2026-02-19 18:51:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:23.081436219 +0000 UTC m=+1057.042656183" watchObservedRunningTime="2026-02-19 18:51:23.082380953 +0000 UTC m=+1057.043600907" Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.511181 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ztkm4-config-jmmzz"] Feb 19 18:51:23 crc kubenswrapper[4749]: I0219 18:51:23.519963 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ztkm4-config-jmmzz"] Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.039248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerStarted","Data":"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.047241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-14db-account-create-update-xdwgq" event={"ID":"55d0d068-45a8-49a7-9cbe-a3ff70cc056f","Type":"ContainerDied","Data":"4f7270a050bf0b55c766d387494cbb82874997b5f40c884d357181fdc9361b53"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.040824 4749 generic.go:334] "Generic (PLEG): container finished" podID="55d0d068-45a8-49a7-9cbe-a3ff70cc056f" containerID="4f7270a050bf0b55c766d387494cbb82874997b5f40c884d357181fdc9361b53" exitCode=0 Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.050538 4749 generic.go:334] "Generic (PLEG): container finished" podID="4fa888c5-51f3-47a3-ae88-65694f44677a" containerID="51054023a2dc977d46e8925ec806bae10007d1e004f36abc9d03bf6988217c7a" exitCode=0 Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.050603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhcc5" 
event={"ID":"4fa888c5-51f3-47a3-ae88-65694f44677a","Type":"ContainerDied","Data":"51054023a2dc977d46e8925ec806bae10007d1e004f36abc9d03bf6988217c7a"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.053522 4749 generic.go:334] "Generic (PLEG): container finished" podID="f34e48f9-e454-4f11-b78e-965516098e91" containerID="cd30e62108093956c6066b5e2d48fc375104fa0c280c57d78b26afdd27a388b2" exitCode=0 Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.053587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nggkc" event={"ID":"f34e48f9-e454-4f11-b78e-965516098e91","Type":"ContainerDied","Data":"cd30e62108093956c6066b5e2d48fc375104fa0c280c57d78b26afdd27a388b2"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.066980 4749 generic.go:334] "Generic (PLEG): container finished" podID="1767b8ff-e819-4268-982c-57cd067b1cd5" containerID="ad0e7d8564db6dacbe89319f85830b5e2c7bc2b01cbebdcb6e3ab4a6d370319f" exitCode=0 Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.067113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1d78-account-create-update-lrl5l" event={"ID":"1767b8ff-e819-4268-982c-57cd067b1cd5","Type":"ContainerDied","Data":"ad0e7d8564db6dacbe89319f85830b5e2c7bc2b01cbebdcb6e3ab4a6d370319f"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.078375 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbc2d05b-8f07-4386-9d4c-604e07f0f265" containerID="3554e0aa16a11cbb9c6f94a23b8e93ea5fc408c9dbeaae7a66d9a5d403192f0e" exitCode=0 Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.078505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cbfb-account-create-update-gzp2p" event={"ID":"fbc2d05b-8f07-4386-9d4c-604e07f0f265","Type":"ContainerDied","Data":"3554e0aa16a11cbb9c6f94a23b8e93ea5fc408c9dbeaae7a66d9a5d403192f0e"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.080274 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="15f880df-7af3-4bab-91f1-5085b70a86d0" containerID="c6616dd3022501a5ad9ae3b9f6dd0bcb3db2fb08710e3f51164bce28d0d78b30" exitCode=0 Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.080315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2xmjl" event={"ID":"15f880df-7af3-4bab-91f1-5085b70a86d0","Type":"ContainerDied","Data":"c6616dd3022501a5ad9ae3b9f6dd0bcb3db2fb08710e3f51164bce28d0d78b30"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.084273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"b9c011c58795b78020d33a56adad437746a6a974e917b539130fe95208bdf031"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.084299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"585a55f9c1f955d55f764cc84be8c2609353d14b9f9ad52bc5057dd0e5303006"} Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.085883 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.085860889 podStartE2EDuration="15.085860889s" podCreationTimestamp="2026-02-19 18:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:24.070271689 +0000 UTC m=+1058.031491653" watchObservedRunningTime="2026-02-19 18:51:24.085860889 +0000 UTC m=+1058.047080843" Feb 19 18:51:24 crc kubenswrapper[4749]: I0219 18:51:24.693818 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d88e0b5-c217-4481-bbbd-594240990a40" path="/var/lib/kubelet/pods/7d88e0b5-c217-4481-bbbd-594240990a40/volumes" Feb 19 18:51:25 crc kubenswrapper[4749]: I0219 18:51:25.114788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"e08f006985e3a7945c89db1ee89e074c770ece5c52a22e277b862bcb9c49f145"} Feb 19 18:51:25 crc kubenswrapper[4749]: I0219 18:51:25.170457 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:25 crc kubenswrapper[4749]: I0219 18:51:25.170515 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:25 crc kubenswrapper[4749]: I0219 18:51:25.175680 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:26 crc kubenswrapper[4749]: I0219 18:51:26.127307 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.144214 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1d78-account-create-update-lrl5l" event={"ID":"1767b8ff-e819-4268-982c-57cd067b1cd5","Type":"ContainerDied","Data":"475851cbe77d7d61e6b33d43f7936ff5ac5e0a0e2d46e254f9d83e1314919902"} Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.144680 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475851cbe77d7d61e6b33d43f7936ff5ac5e0a0e2d46e254f9d83e1314919902" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.146078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cbfb-account-create-update-gzp2p" event={"ID":"fbc2d05b-8f07-4386-9d4c-604e07f0f265","Type":"ContainerDied","Data":"88e1a3d68fc3a2943b7f28b5dc67154b573a354b76747fdedf155f010f4e726d"} Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.146115 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88e1a3d68fc3a2943b7f28b5dc67154b573a354b76747fdedf155f010f4e726d" Feb 19 
18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.147107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2xmjl" event={"ID":"15f880df-7af3-4bab-91f1-5085b70a86d0","Type":"ContainerDied","Data":"e287229ca80deb76c1ffe3ec7f166df59c2d98c0269212aae4395d1e978f1194"} Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.147127 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e287229ca80deb76c1ffe3ec7f166df59c2d98c0269212aae4395d1e978f1194" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.148383 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-14db-account-create-update-xdwgq" event={"ID":"55d0d068-45a8-49a7-9cbe-a3ff70cc056f","Type":"ContainerDied","Data":"756c142bd8e899b5b55fe58868a3b559ccb5fa11980904e221561693f0c74111"} Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.148434 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756c142bd8e899b5b55fe58868a3b559ccb5fa11980904e221561693f0c74111" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.149336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhcc5" event={"ID":"4fa888c5-51f3-47a3-ae88-65694f44677a","Type":"ContainerDied","Data":"1ab18b1b301233d4495d4c51b4d490762920610c2f53414c0ecfbb0bef169bac"} Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.149354 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ab18b1b301233d4495d4c51b4d490762920610c2f53414c0ecfbb0bef169bac" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.150386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nggkc" event={"ID":"f34e48f9-e454-4f11-b78e-965516098e91","Type":"ContainerDied","Data":"df1589f0a03c323489f6ce9978a10b1d6427c496cc9481dbdb11ae4efdae78d6"} Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.150405 4749 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="df1589f0a03c323489f6ce9978a10b1d6427c496cc9481dbdb11ae4efdae78d6" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.189875 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-14db-account-create-update-xdwgq" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.199614 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2xmjl" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.208463 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cbfb-account-create-update-gzp2p" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.218075 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nggkc" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.237007 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d78-account-create-update-lrl5l" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.241736 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rhcc5" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34e48f9-e454-4f11-b78e-965516098e91-operator-scripts\") pod \"f34e48f9-e454-4f11-b78e-965516098e91\" (UID: \"f34e48f9-e454-4f11-b78e-965516098e91\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348464 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g5bv\" (UniqueName: \"kubernetes.io/projected/f34e48f9-e454-4f11-b78e-965516098e91-kube-api-access-6g5bv\") pod \"f34e48f9-e454-4f11-b78e-965516098e91\" (UID: \"f34e48f9-e454-4f11-b78e-965516098e91\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld7m5\" (UniqueName: \"kubernetes.io/projected/15f880df-7af3-4bab-91f1-5085b70a86d0-kube-api-access-ld7m5\") pod \"15f880df-7af3-4bab-91f1-5085b70a86d0\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1767b8ff-e819-4268-982c-57cd067b1cd5-operator-scripts\") pod \"1767b8ff-e819-4268-982c-57cd067b1cd5\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqc2c\" (UniqueName: \"kubernetes.io/projected/1767b8ff-e819-4268-982c-57cd067b1cd5-kube-api-access-bqc2c\") pod \"1767b8ff-e819-4268-982c-57cd067b1cd5\" (UID: \"1767b8ff-e819-4268-982c-57cd067b1cd5\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348617 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8sf\" (UniqueName: \"kubernetes.io/projected/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-kube-api-access-9v8sf\") pod \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlhp8\" (UniqueName: \"kubernetes.io/projected/4fa888c5-51f3-47a3-ae88-65694f44677a-kube-api-access-wlhp8\") pod \"4fa888c5-51f3-47a3-ae88-65694f44677a\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa888c5-51f3-47a3-ae88-65694f44677a-operator-scripts\") pod \"4fa888c5-51f3-47a3-ae88-65694f44677a\" (UID: \"4fa888c5-51f3-47a3-ae88-65694f44677a\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb2n6\" (UniqueName: \"kubernetes.io/projected/fbc2d05b-8f07-4386-9d4c-604e07f0f265-kube-api-access-pb2n6\") pod \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f880df-7af3-4bab-91f1-5085b70a86d0-operator-scripts\") pod \"15f880df-7af3-4bab-91f1-5085b70a86d0\" (UID: \"15f880df-7af3-4bab-91f1-5085b70a86d0\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-operator-scripts\") pod 
\"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\" (UID: \"55d0d068-45a8-49a7-9cbe-a3ff70cc056f\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbc2d05b-8f07-4386-9d4c-604e07f0f265-operator-scripts\") pod \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\" (UID: \"fbc2d05b-8f07-4386-9d4c-604e07f0f265\") " Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34e48f9-e454-4f11-b78e-965516098e91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f34e48f9-e454-4f11-b78e-965516098e91" (UID: "f34e48f9-e454-4f11-b78e-965516098e91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.348976 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1767b8ff-e819-4268-982c-57cd067b1cd5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1767b8ff-e819-4268-982c-57cd067b1cd5" (UID: "1767b8ff-e819-4268-982c-57cd067b1cd5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.349273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa888c5-51f3-47a3-ae88-65694f44677a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fa888c5-51f3-47a3-ae88-65694f44677a" (UID: "4fa888c5-51f3-47a3-ae88-65694f44677a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.349669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55d0d068-45a8-49a7-9cbe-a3ff70cc056f" (UID: "55d0d068-45a8-49a7-9cbe-a3ff70cc056f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.349678 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f880df-7af3-4bab-91f1-5085b70a86d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15f880df-7af3-4bab-91f1-5085b70a86d0" (UID: "15f880df-7af3-4bab-91f1-5085b70a86d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.349800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc2d05b-8f07-4386-9d4c-604e07f0f265-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbc2d05b-8f07-4386-9d4c-604e07f0f265" (UID: "fbc2d05b-8f07-4386-9d4c-604e07f0f265"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.350427 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1767b8ff-e819-4268-982c-57cd067b1cd5-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.350458 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34e48f9-e454-4f11-b78e-965516098e91-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.353492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1767b8ff-e819-4268-982c-57cd067b1cd5-kube-api-access-bqc2c" (OuterVolumeSpecName: "kube-api-access-bqc2c") pod "1767b8ff-e819-4268-982c-57cd067b1cd5" (UID: "1767b8ff-e819-4268-982c-57cd067b1cd5"). InnerVolumeSpecName "kube-api-access-bqc2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.353504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-kube-api-access-9v8sf" (OuterVolumeSpecName: "kube-api-access-9v8sf") pod "55d0d068-45a8-49a7-9cbe-a3ff70cc056f" (UID: "55d0d068-45a8-49a7-9cbe-a3ff70cc056f"). InnerVolumeSpecName "kube-api-access-9v8sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.353535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa888c5-51f3-47a3-ae88-65694f44677a-kube-api-access-wlhp8" (OuterVolumeSpecName: "kube-api-access-wlhp8") pod "4fa888c5-51f3-47a3-ae88-65694f44677a" (UID: "4fa888c5-51f3-47a3-ae88-65694f44677a"). InnerVolumeSpecName "kube-api-access-wlhp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.357196 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc2d05b-8f07-4386-9d4c-604e07f0f265-kube-api-access-pb2n6" (OuterVolumeSpecName: "kube-api-access-pb2n6") pod "fbc2d05b-8f07-4386-9d4c-604e07f0f265" (UID: "fbc2d05b-8f07-4386-9d4c-604e07f0f265"). InnerVolumeSpecName "kube-api-access-pb2n6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.361299 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f880df-7af3-4bab-91f1-5085b70a86d0-kube-api-access-ld7m5" (OuterVolumeSpecName: "kube-api-access-ld7m5") pod "15f880df-7af3-4bab-91f1-5085b70a86d0" (UID: "15f880df-7af3-4bab-91f1-5085b70a86d0"). InnerVolumeSpecName "kube-api-access-ld7m5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.363474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34e48f9-e454-4f11-b78e-965516098e91-kube-api-access-6g5bv" (OuterVolumeSpecName: "kube-api-access-6g5bv") pod "f34e48f9-e454-4f11-b78e-965516098e91" (UID: "f34e48f9-e454-4f11-b78e-965516098e91"). InnerVolumeSpecName "kube-api-access-6g5bv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451558 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqc2c\" (UniqueName: \"kubernetes.io/projected/1767b8ff-e819-4268-982c-57cd067b1cd5-kube-api-access-bqc2c\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451589 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8sf\" (UniqueName: \"kubernetes.io/projected/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-kube-api-access-9v8sf\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451598 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlhp8\" (UniqueName: \"kubernetes.io/projected/4fa888c5-51f3-47a3-ae88-65694f44677a-kube-api-access-wlhp8\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451607 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fa888c5-51f3-47a3-ae88-65694f44677a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451615 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb2n6\" (UniqueName: \"kubernetes.io/projected/fbc2d05b-8f07-4386-9d4c-604e07f0f265-kube-api-access-pb2n6\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451623 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f880df-7af3-4bab-91f1-5085b70a86d0-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451631 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d0d068-45a8-49a7-9cbe-a3ff70cc056f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451640 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbc2d05b-8f07-4386-9d4c-604e07f0f265-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451648 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g5bv\" (UniqueName: \"kubernetes.io/projected/f34e48f9-e454-4f11-b78e-965516098e91-kube-api-access-6g5bv\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:29 crc kubenswrapper[4749]: I0219 18:51:29.451656 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld7m5\" (UniqueName: \"kubernetes.io/projected/15f880df-7af3-4bab-91f1-5085b70a86d0-kube-api-access-ld7m5\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:30 crc kubenswrapper[4749]: I0219 18:51:30.160138 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cbfb-account-create-update-gzp2p"
Feb 19 18:51:30 crc kubenswrapper[4749]: I0219 18:51:30.160179 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nggkc"
Feb 19 18:51:30 crc kubenswrapper[4749]: I0219 18:51:30.160140 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rhcc5"
Feb 19 18:51:30 crc kubenswrapper[4749]: I0219 18:51:30.160194 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-14db-account-create-update-xdwgq"
Feb 19 18:51:30 crc kubenswrapper[4749]: I0219 18:51:30.160206 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2xmjl"
Feb 19 18:51:30 crc kubenswrapper[4749]: I0219 18:51:30.160138 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d78-account-create-update-lrl5l"
Feb 19 18:51:31 crc kubenswrapper[4749]: I0219 18:51:31.177785 4749 generic.go:334] "Generic (PLEG): container finished" podID="4fb0bea8-d0bd-4a01-a752-7c9697971db8" containerID="41eb58b1338db1e882273d3a36b1da142d86f374ea98f3b78a0578f261bd70df" exitCode=0
Feb 19 18:51:31 crc kubenswrapper[4749]: I0219 18:51:31.177955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wjqf" event={"ID":"4fb0bea8-d0bd-4a01-a752-7c9697971db8","Type":"ContainerDied","Data":"41eb58b1338db1e882273d3a36b1da142d86f374ea98f3b78a0578f261bd70df"}
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.194624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wjqf" event={"ID":"4fb0bea8-d0bd-4a01-a752-7c9697971db8","Type":"ContainerDied","Data":"aa301d9176a8bc4314fc8a194c27f029da5066c9dbdafc5e774ea7671fed62f2"}
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.195180 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa301d9176a8bc4314fc8a194c27f029da5066c9dbdafc5e774ea7671fed62f2"
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.429955 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.523818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-db-sync-config-data\") pod \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") "
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.523884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-combined-ca-bundle\") pod \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") "
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.523944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjrds\" (UniqueName: \"kubernetes.io/projected/4fb0bea8-d0bd-4a01-a752-7c9697971db8-kube-api-access-qjrds\") pod \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") "
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.524006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-config-data\") pod \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\" (UID: \"4fb0bea8-d0bd-4a01-a752-7c9697971db8\") "
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.528077 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb0bea8-d0bd-4a01-a752-7c9697971db8-kube-api-access-qjrds" (OuterVolumeSpecName: "kube-api-access-qjrds") pod "4fb0bea8-d0bd-4a01-a752-7c9697971db8" (UID: "4fb0bea8-d0bd-4a01-a752-7c9697971db8"). InnerVolumeSpecName "kube-api-access-qjrds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.528901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4fb0bea8-d0bd-4a01-a752-7c9697971db8" (UID: "4fb0bea8-d0bd-4a01-a752-7c9697971db8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.556883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fb0bea8-d0bd-4a01-a752-7c9697971db8" (UID: "4fb0bea8-d0bd-4a01-a752-7c9697971db8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.587967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-config-data" (OuterVolumeSpecName: "config-data") pod "4fb0bea8-d0bd-4a01-a752-7c9697971db8" (UID: "4fb0bea8-d0bd-4a01-a752-7c9697971db8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.626405 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.626441 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.626453 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb0bea8-d0bd-4a01-a752-7c9697971db8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:33 crc kubenswrapper[4749]: I0219 18:51:33.626463 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjrds\" (UniqueName: \"kubernetes.io/projected/4fb0bea8-d0bd-4a01-a752-7c9697971db8-kube-api-access-qjrds\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.204448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-trrfz" event={"ID":"284da664-b432-48cf-8f30-fa7cc57bd5b3","Type":"ContainerStarted","Data":"352a35ca5c0a0988d7f8bf9cfca788b7f9d858c3dcd1018d9be50641b86b57ef"}
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.220014 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-trrfz" podStartSLOduration=3.082900404 podStartE2EDuration="14.219996533s" podCreationTimestamp="2026-02-19 18:51:20 +0000 UTC" firstStartedPulling="2026-02-19 18:51:22.100674555 +0000 UTC m=+1056.061894509" lastFinishedPulling="2026-02-19 18:51:33.237770684 +0000 UTC m=+1067.198990638" observedRunningTime="2026-02-19 18:51:34.219193684 +0000 UTC m=+1068.180413648" watchObservedRunningTime="2026-02-19 18:51:34.219996533 +0000 UTC m=+1068.181216487"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.228122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"564e0ec262fbe857fba50a7c6c6db6c6a022231c41fce021e88cfe3792cbe3ef"}
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.229853 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9wjqf"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.230123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-tkp7q" event={"ID":"04522760-872e-4efc-a852-726292ac24f4","Type":"ContainerStarted","Data":"f5ce2cb7b7b99196d49aff9f25ab067642afbe62910cb83f462d2f20329bc557"}
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.252870 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-tkp7q" podStartSLOduration=2.346355944 podStartE2EDuration="13.252849143s" podCreationTimestamp="2026-02-19 18:51:21 +0000 UTC" firstStartedPulling="2026-02-19 18:51:22.355172957 +0000 UTC m=+1056.316392911" lastFinishedPulling="2026-02-19 18:51:33.261666166 +0000 UTC m=+1067.222886110" observedRunningTime="2026-02-19 18:51:34.242495991 +0000 UTC m=+1068.203715945" watchObservedRunningTime="2026-02-19 18:51:34.252849143 +0000 UTC m=+1068.214069097"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.862375 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b56fc7d7c-pb4d6"]
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.862899 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f880df-7af3-4bab-91f1-5085b70a86d0" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.862913 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f880df-7af3-4bab-91f1-5085b70a86d0" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.862926 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc2d05b-8f07-4386-9d4c-604e07f0f265" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.862932 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc2d05b-8f07-4386-9d4c-604e07f0f265" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.862944 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34e48f9-e454-4f11-b78e-965516098e91" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.862950 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34e48f9-e454-4f11-b78e-965516098e91" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.862964 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb0bea8-d0bd-4a01-a752-7c9697971db8" containerName="glance-db-sync"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.862969 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb0bea8-d0bd-4a01-a752-7c9697971db8" containerName="glance-db-sync"
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.862979 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d88e0b5-c217-4481-bbbd-594240990a40" containerName="ovn-config"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.862984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d88e0b5-c217-4481-bbbd-594240990a40" containerName="ovn-config"
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.862991 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1767b8ff-e819-4268-982c-57cd067b1cd5" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.862997 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1767b8ff-e819-4268-982c-57cd067b1cd5" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.863005 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d0d068-45a8-49a7-9cbe-a3ff70cc056f" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863010 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d0d068-45a8-49a7-9cbe-a3ff70cc056f" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: E0219 18:51:34.863023 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa888c5-51f3-47a3-ae88-65694f44677a" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863041 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa888c5-51f3-47a3-ae88-65694f44677a" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863191 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc2d05b-8f07-4386-9d4c-604e07f0f265" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863229 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d0d068-45a8-49a7-9cbe-a3ff70cc056f" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863239 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d88e0b5-c217-4481-bbbd-594240990a40" containerName="ovn-config"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863246 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa888c5-51f3-47a3-ae88-65694f44677a" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863255 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34e48f9-e454-4f11-b78e-965516098e91" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863263 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb0bea8-d0bd-4a01-a752-7c9697971db8" containerName="glance-db-sync"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863270 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1767b8ff-e819-4268-982c-57cd067b1cd5" containerName="mariadb-account-create-update"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.863280 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f880df-7af3-4bab-91f1-5085b70a86d0" containerName="mariadb-database-create"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.864073 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.895628 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b56fc7d7c-pb4d6"]
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.956520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-config\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.956620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xxs\" (UniqueName: \"kubernetes.io/projected/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-kube-api-access-78xxs\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.956645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-sb\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.956694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-dns-svc\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:34 crc kubenswrapper[4749]: I0219 18:51:34.956727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-nb\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.058069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-dns-svc\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.058133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-nb\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.058199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-config\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.058246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78xxs\" (UniqueName: \"kubernetes.io/projected/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-kube-api-access-78xxs\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.058267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-sb\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.090387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-sb\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.090666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-nb\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.090762 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-config\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.090773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-dns-svc\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.094871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xxs\" (UniqueName: \"kubernetes.io/projected/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-kube-api-access-78xxs\") pod \"dnsmasq-dns-b56fc7d7c-pb4d6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.203354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.273384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"59a3f89cefa7685dd2aebada2611e7f70fac0fdac4e5cfb63a2d4b29996e07ab"}
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.274646 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"1ae8e0a09d97e2c910c59723c3b3bd7720c6165958e3f6ea4095332d0df98eed"}
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.274710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"8fdf7c003797b0d4e9346b5a561fffb34484d85dc41474c91c2c641ffc84c7f4"}
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.274761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"299220b26018de4ffdedc5983d3b0e10e7a89ce052c868f4bf537e227e6b48ae"}
Feb 19 18:51:35 crc kubenswrapper[4749]: I0219 18:51:35.774129 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b56fc7d7c-pb4d6"]
Feb 19 18:51:35 crc kubenswrapper[4749]: W0219 18:51:35.786275 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10f9e1fd_86a0_43f9_bfd5_8cfa3e4681e6.slice/crio-833f65e09108607cc79087bfc6a3d112ca3dec7f116b38900c0e6a06834c1899 WatchSource:0}: Error finding container 833f65e09108607cc79087bfc6a3d112ca3dec7f116b38900c0e6a06834c1899: Status 404 returned error can't find the container with id 833f65e09108607cc79087bfc6a3d112ca3dec7f116b38900c0e6a06834c1899
Feb 19 18:51:36 crc kubenswrapper[4749]: I0219 18:51:36.281783 4749 generic.go:334] "Generic (PLEG): container finished" podID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerID="a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067" exitCode=0
Feb 19 18:51:36 crc kubenswrapper[4749]: I0219 18:51:36.281851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" event={"ID":"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6","Type":"ContainerDied","Data":"a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067"}
Feb 19 18:51:36 crc kubenswrapper[4749]: I0219 18:51:36.281880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" event={"ID":"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6","Type":"ContainerStarted","Data":"833f65e09108607cc79087bfc6a3d112ca3dec7f116b38900c0e6a06834c1899"}
Feb 19 18:51:36 crc kubenswrapper[4749]: I0219 18:51:36.288366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"99c9e0254280b816f6a1acd20179abae4aaf8b9750ccf39667ef9df422039da5"}
Feb 19 18:51:36 crc kubenswrapper[4749]: I0219 18:51:36.288410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"6bd4774faf964a528356ec3681f2f755cca70d0733ee7f497d3f06cd2295fc7c"}
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.297788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" event={"ID":"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6","Type":"ContainerStarted","Data":"cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e"}
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.298291 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.307419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ece11938-c758-4d62-ad84-c630d040f511","Type":"ContainerStarted","Data":"9d15e44dd30387d7a0356e0d4dff570aaf09db7f7231f7d4fdb2b438b8f36262"}
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.350378 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" podStartSLOduration=3.350363272 podStartE2EDuration="3.350363272s" podCreationTimestamp="2026-02-19 18:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:37.318157359 +0000 UTC m=+1071.279377313" watchObservedRunningTime="2026-02-19 18:51:37.350363272 +0000 UTC m=+1071.311583226"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.352203 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.366653697 podStartE2EDuration="54.352195167s" podCreationTimestamp="2026-02-19 18:50:43 +0000 UTC" firstStartedPulling="2026-02-19 18:51:19.169729759 +0000 UTC m=+1053.130949713" lastFinishedPulling="2026-02-19 18:51:34.155271229 +0000 UTC m=+1068.116491183" observedRunningTime="2026-02-19 18:51:37.348456536 +0000 UTC m=+1071.309676490" watchObservedRunningTime="2026-02-19 18:51:37.352195167 +0000 UTC m=+1071.313415131"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.614700 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b56fc7d7c-pb4d6"]
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.647154 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-778545d54f-gjkrp"]
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.648852 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.650597 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.670444 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778545d54f-gjkrp"]
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.824778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-nb\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.824844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-config\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.824880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-sb\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.824968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-svc\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.825012 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb4rr\" (UniqueName: \"kubernetes.io/projected/25a49c28-995a-475e-a81f-45632b61104d-kube-api-access-lb4rr\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.825090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-swift-storage-0\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.926339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-config\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.926411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-sb\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.926442 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-svc\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.926482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb4rr\" (UniqueName: \"kubernetes.io/projected/25a49c28-995a-475e-a81f-45632b61104d-kube-api-access-lb4rr\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.926536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-swift-storage-0\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.926614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-nb\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.927403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-config\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.927410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-svc\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.927569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-nb\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.927623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-swift-storage-0\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp"
Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.928040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-sb\") pod 
\"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp" Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.955109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb4rr\" (UniqueName: \"kubernetes.io/projected/25a49c28-995a-475e-a81f-45632b61104d-kube-api-access-lb4rr\") pod \"dnsmasq-dns-778545d54f-gjkrp\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " pod="openstack/dnsmasq-dns-778545d54f-gjkrp" Feb 19 18:51:37 crc kubenswrapper[4749]: I0219 18:51:37.969130 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" Feb 19 18:51:38 crc kubenswrapper[4749]: I0219 18:51:38.433842 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778545d54f-gjkrp"] Feb 19 18:51:38 crc kubenswrapper[4749]: W0219 18:51:38.437730 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25a49c28_995a_475e_a81f_45632b61104d.slice/crio-79eec8e199b87de0a27a5606ba1f965c91e263384cb9b513197600d562675457 WatchSource:0}: Error finding container 79eec8e199b87de0a27a5606ba1f965c91e263384cb9b513197600d562675457: Status 404 returned error can't find the container with id 79eec8e199b87de0a27a5606ba1f965c91e263384cb9b513197600d562675457 Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.325631 4749 generic.go:334] "Generic (PLEG): container finished" podID="04522760-872e-4efc-a852-726292ac24f4" containerID="f5ce2cb7b7b99196d49aff9f25ab067642afbe62910cb83f462d2f20329bc557" exitCode=0 Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.325725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-tkp7q" event={"ID":"04522760-872e-4efc-a852-726292ac24f4","Type":"ContainerDied","Data":"f5ce2cb7b7b99196d49aff9f25ab067642afbe62910cb83f462d2f20329bc557"} Feb 19 18:51:39 crc 
kubenswrapper[4749]: I0219 18:51:39.327836 4749 generic.go:334] "Generic (PLEG): container finished" podID="25a49c28-995a-475e-a81f-45632b61104d" containerID="d29ebf567c6e8b75e175622c9aba2535778054730821ed7d8ef3bd17f7197f33" exitCode=0 Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.327952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" event={"ID":"25a49c28-995a-475e-a81f-45632b61104d","Type":"ContainerDied","Data":"d29ebf567c6e8b75e175622c9aba2535778054730821ed7d8ef3bd17f7197f33"} Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.327976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" event={"ID":"25a49c28-995a-475e-a81f-45632b61104d","Type":"ContainerStarted","Data":"79eec8e199b87de0a27a5606ba1f965c91e263384cb9b513197600d562675457"} Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.328100 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" podUID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerName="dnsmasq-dns" containerID="cri-o://cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e" gracePeriod=10 Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.810138 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.963789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-nb\") pod \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.964212 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78xxs\" (UniqueName: \"kubernetes.io/projected/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-kube-api-access-78xxs\") pod \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.964255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-sb\") pod \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.964344 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-config\") pod \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.964385 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-dns-svc\") pod \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\" (UID: \"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6\") " Feb 19 18:51:39 crc kubenswrapper[4749]: I0219 18:51:39.969331 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-kube-api-access-78xxs" (OuterVolumeSpecName: "kube-api-access-78xxs") pod "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" (UID: "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6"). InnerVolumeSpecName "kube-api-access-78xxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.005405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" (UID: "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.007282 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" (UID: "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.008467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" (UID: "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.024850 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-config" (OuterVolumeSpecName: "config") pod "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" (UID: "10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.065999 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78xxs\" (UniqueName: \"kubernetes.io/projected/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-kube-api-access-78xxs\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.066198 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.066289 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.066390 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.068364 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.340216 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" event={"ID":"25a49c28-995a-475e-a81f-45632b61104d","Type":"ContainerStarted","Data":"e313d87120d3eabcf0d8c7e3ff4af3de5074a5b09ab3e2b69effe282212aa9dd"} Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.341636 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.344376 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerID="cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e" exitCode=0 Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.344457 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.344481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" event={"ID":"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6","Type":"ContainerDied","Data":"cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e"} Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.344647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b56fc7d7c-pb4d6" event={"ID":"10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6","Type":"ContainerDied","Data":"833f65e09108607cc79087bfc6a3d112ca3dec7f116b38900c0e6a06834c1899"} Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.344699 4749 scope.go:117] "RemoveContainer" containerID="cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.347664 4749 generic.go:334] "Generic (PLEG): container finished" podID="284da664-b432-48cf-8f30-fa7cc57bd5b3" containerID="352a35ca5c0a0988d7f8bf9cfca788b7f9d858c3dcd1018d9be50641b86b57ef" exitCode=0 Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.347955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-trrfz" event={"ID":"284da664-b432-48cf-8f30-fa7cc57bd5b3","Type":"ContainerDied","Data":"352a35ca5c0a0988d7f8bf9cfca788b7f9d858c3dcd1018d9be50641b86b57ef"} Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.398963 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" podStartSLOduration=3.398946321 podStartE2EDuration="3.398946321s" podCreationTimestamp="2026-02-19 18:51:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:40.373595604 +0000 UTC m=+1074.334815558" watchObservedRunningTime="2026-02-19 18:51:40.398946321 +0000 UTC m=+1074.360166275" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.424152 4749 scope.go:117] "RemoveContainer" containerID="a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.432499 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b56fc7d7c-pb4d6"] Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.440418 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b56fc7d7c-pb4d6"] Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.446633 4749 scope.go:117] "RemoveContainer" containerID="cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e" Feb 19 18:51:40 crc kubenswrapper[4749]: E0219 18:51:40.447152 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e\": container with ID starting with cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e not found: ID does not exist" containerID="cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.447186 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e"} err="failed to get container status \"cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e\": rpc error: code = NotFound desc = could not find container \"cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e\": container with ID starting with cfd156470d53cdac4a098cc1e68daeb6fcd103b85ddbcb21073d73154747f40e not found: ID does 
not exist" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.447207 4749 scope.go:117] "RemoveContainer" containerID="a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067" Feb 19 18:51:40 crc kubenswrapper[4749]: E0219 18:51:40.447541 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067\": container with ID starting with a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067 not found: ID does not exist" containerID="a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.447562 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067"} err="failed to get container status \"a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067\": rpc error: code = NotFound desc = could not find container \"a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067\": container with ID starting with a95f20992b8f3c8efc12e85765a91fe6ba0d03a21dd4e2abff70cc871c71f067 not found: ID does not exist" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.696680 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" path="/var/lib/kubelet/pods/10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6/volumes" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.752189 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.886775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-combined-ca-bundle\") pod \"04522760-872e-4efc-a852-726292ac24f4\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.886837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-config-data\") pod \"04522760-872e-4efc-a852-726292ac24f4\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.886942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldgjl\" (UniqueName: \"kubernetes.io/projected/04522760-872e-4efc-a852-726292ac24f4-kube-api-access-ldgjl\") pod \"04522760-872e-4efc-a852-726292ac24f4\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.887002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-db-sync-config-data\") pod \"04522760-872e-4efc-a852-726292ac24f4\" (UID: \"04522760-872e-4efc-a852-726292ac24f4\") " Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.891396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "04522760-872e-4efc-a852-726292ac24f4" (UID: "04522760-872e-4efc-a852-726292ac24f4"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.891512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04522760-872e-4efc-a852-726292ac24f4-kube-api-access-ldgjl" (OuterVolumeSpecName: "kube-api-access-ldgjl") pod "04522760-872e-4efc-a852-726292ac24f4" (UID: "04522760-872e-4efc-a852-726292ac24f4"). InnerVolumeSpecName "kube-api-access-ldgjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.912372 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04522760-872e-4efc-a852-726292ac24f4" (UID: "04522760-872e-4efc-a852-726292ac24f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.925699 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-config-data" (OuterVolumeSpecName: "config-data") pod "04522760-872e-4efc-a852-726292ac24f4" (UID: "04522760-872e-4efc-a852-726292ac24f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.988951 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.988987 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.988996 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522760-872e-4efc-a852-726292ac24f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:40 crc kubenswrapper[4749]: I0219 18:51:40.989005 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldgjl\" (UniqueName: \"kubernetes.io/projected/04522760-872e-4efc-a852-726292ac24f4-kube-api-access-ldgjl\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.365845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-tkp7q" event={"ID":"04522760-872e-4efc-a852-726292ac24f4","Type":"ContainerDied","Data":"3c418d6c071ce094e0b5a939a385d91ddaac3572e2cfd636845541b4460e9446"} Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.365882 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-tkp7q" Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.365888 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c418d6c071ce094e0b5a939a385d91ddaac3572e2cfd636845541b4460e9446" Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.751171 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.905740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttflw\" (UniqueName: \"kubernetes.io/projected/284da664-b432-48cf-8f30-fa7cc57bd5b3-kube-api-access-ttflw\") pod \"284da664-b432-48cf-8f30-fa7cc57bd5b3\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.905886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-combined-ca-bundle\") pod \"284da664-b432-48cf-8f30-fa7cc57bd5b3\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.905964 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-config-data\") pod \"284da664-b432-48cf-8f30-fa7cc57bd5b3\" (UID: \"284da664-b432-48cf-8f30-fa7cc57bd5b3\") " Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.911413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284da664-b432-48cf-8f30-fa7cc57bd5b3-kube-api-access-ttflw" (OuterVolumeSpecName: "kube-api-access-ttflw") pod "284da664-b432-48cf-8f30-fa7cc57bd5b3" (UID: "284da664-b432-48cf-8f30-fa7cc57bd5b3"). InnerVolumeSpecName "kube-api-access-ttflw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.934212 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "284da664-b432-48cf-8f30-fa7cc57bd5b3" (UID: "284da664-b432-48cf-8f30-fa7cc57bd5b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:41 crc kubenswrapper[4749]: I0219 18:51:41.957560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-config-data" (OuterVolumeSpecName: "config-data") pod "284da664-b432-48cf-8f30-fa7cc57bd5b3" (UID: "284da664-b432-48cf-8f30-fa7cc57bd5b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.007810 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.007855 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttflw\" (UniqueName: \"kubernetes.io/projected/284da664-b432-48cf-8f30-fa7cc57bd5b3-kube-api-access-ttflw\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.007866 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284da664-b432-48cf-8f30-fa7cc57bd5b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.374232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-trrfz" event={"ID":"284da664-b432-48cf-8f30-fa7cc57bd5b3","Type":"ContainerDied","Data":"2dcfa948891457b1ab4dfda1923092ce8ec0432bc78f656baab626fcf2787335"} Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.374277 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcfa948891457b1ab4dfda1923092ce8ec0432bc78f656baab626fcf2787335" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.374238 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-trrfz" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.627983 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778545d54f-gjkrp"] Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.668537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5889f4df85-qt29k"] Feb 19 18:51:42 crc kubenswrapper[4749]: E0219 18:51:42.668940 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284da664-b432-48cf-8f30-fa7cc57bd5b3" containerName="keystone-db-sync" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.668958 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="284da664-b432-48cf-8f30-fa7cc57bd5b3" containerName="keystone-db-sync" Feb 19 18:51:42 crc kubenswrapper[4749]: E0219 18:51:42.668975 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04522760-872e-4efc-a852-726292ac24f4" containerName="watcher-db-sync" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.668982 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="04522760-872e-4efc-a852-726292ac24f4" containerName="watcher-db-sync" Feb 19 18:51:42 crc kubenswrapper[4749]: E0219 18:51:42.669003 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerName="init" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.669011 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerName="init" Feb 19 18:51:42 crc kubenswrapper[4749]: E0219 18:51:42.669041 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerName="dnsmasq-dns" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.669050 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerName="dnsmasq-dns" Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 
18:51:42.669244 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f9e1fd-86a0-43f9-bfd5-8cfa3e4681e6" containerName="dnsmasq-dns"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.669259 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="284da664-b432-48cf-8f30-fa7cc57bd5b3" containerName="keystone-db-sync"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.669277 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="04522760-872e-4efc-a852-726292ac24f4" containerName="watcher-db-sync"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.670522 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.724092 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8rjdg"]
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.725552 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.729944 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.730389 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xqgf2"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.730553 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.730798 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.730917 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.768236 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8rjdg"]
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.809338 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5889f4df85-qt29k"]
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.823796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-config\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.823853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-config-data\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.823912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-sb\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.823937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-nb\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.823955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-combined-ca-bundle\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.823991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfbzv\" (UniqueName: \"kubernetes.io/projected/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-kube-api-access-tfbzv\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.824015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkf6q\" (UniqueName: \"kubernetes.io/projected/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-kube-api-access-wkf6q\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.824064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-credential-keys\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.824166 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-scripts\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.824329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-svc\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.824388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-swift-storage-0\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.824445 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-fernet-keys\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.825609 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.826877 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.862377 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-p6p4n"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.862409 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.869099 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.888614 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.893238 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.900175 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.917606 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfbzv\" (UniqueName: \"kubernetes.io/projected/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-kube-api-access-tfbzv\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkf6q\" (UniqueName: \"kubernetes.io/projected/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-kube-api-access-wkf6q\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927325 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d504807c-589e-4665-bc1d-865473001928-logs\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-credential-keys\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-scripts\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-svc\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-swift-storage-0\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-fernet-keys\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-config\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-config-data\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-sb\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-nb\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-config-data\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-combined-ca-bundle\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.927714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj7st\" (UniqueName: \"kubernetes.io/projected/d504807c-589e-4665-bc1d-865473001928-kube-api-access-jj7st\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.929567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-swift-storage-0\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.934061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-config\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.934391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-sb\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.935070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-svc\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.935755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-nb\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.946351 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-fernet-keys\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.957406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-scripts\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.959245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-config-data\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.960570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-credential-keys\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.964771 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-combined-ca-bundle\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:42 crc kubenswrapper[4749]: I0219 18:51:42.990324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfbzv\" (UniqueName: \"kubernetes.io/projected/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-kube-api-access-tfbzv\") pod \"dnsmasq-dns-5889f4df85-qt29k\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") " pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.002412 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.003714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.012912 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.019896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkf6q\" (UniqueName: \"kubernetes.io/projected/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-kube-api-access-wkf6q\") pod \"keystone-bootstrap-8rjdg\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.038416 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.038956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-config-data\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.038998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj7st\" (UniqueName: \"kubernetes.io/projected/d504807c-589e-4665-bc1d-865473001928-kube-api-access-jj7st\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.039052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052127 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d504807c-589e-4665-bc1d-865473001928-logs\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck9ss\" (UniqueName: \"kubernetes.io/projected/763173db-176a-426d-bd85-e051d56ec5cf-kube-api-access-ck9ss\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-config-data\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763173db-176a-426d-bd85-e051d56ec5cf-logs\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.052959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d504807c-589e-4665-bc1d-865473001928-logs\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.067198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-config-data\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.067719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.084641 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8rjdg"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.086499 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59c4845857-58lkr"]
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.087988 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59c4845857-58lkr"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.126732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj7st\" (UniqueName: \"kubernetes.io/projected/d504807c-589e-4665-bc1d-865473001928-kube-api-access-jj7st\") pod \"watcher-applier-0\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.135335 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.135530 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8cs9j"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.135653 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.145242 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.168785 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59c4845857-58lkr"]
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.190038 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.222504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-config-data\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.222611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.222703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.222734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql7zx\" (UniqueName: \"kubernetes.io/projected/facae815-e01a-4641-b07c-3530303cc691-kube-api-access-ql7zx\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.222828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9ss\" (UniqueName: \"kubernetes.io/projected/763173db-176a-426d-bd85-e051d56ec5cf-kube-api-access-ck9ss\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.222881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-config-data\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.223586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763173db-176a-426d-bd85-e051d56ec5cf-logs\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.223626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/facae815-e01a-4641-b07c-3530303cc691-logs\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.223669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.223721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.225160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763173db-176a-426d-bd85-e051d56ec5cf-logs\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.231067 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.245723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-config-data\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.251108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.266229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck9ss\" (UniqueName: \"kubernetes.io/projected/763173db-176a-426d-bd85-e051d56ec5cf-kube-api-access-ck9ss\") pod \"watcher-decision-engine-0\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " pod="openstack/watcher-decision-engine-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.329111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-config-data\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8ec365-fbcd-4680-90cf-9a41cae499f3-horizon-secret-key\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-scripts\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/facae815-e01a-4641-b07c-3530303cc691-logs\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6r7\" (UniqueName: \"kubernetes.io/projected/bd8ec365-fbcd-4680-90cf-9a41cae499f3-kube-api-access-4d6r7\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-config-data\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.337815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/facae815-e01a-4641-b07c-3530303cc691-logs\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.340715 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.345339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.347377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-config-data\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.350733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.357240 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.364606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7zx\" (UniqueName: \"kubernetes.io/projected/facae815-e01a-4641-b07c-3530303cc691-kube-api-access-ql7zx\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.364655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ec365-fbcd-4680-90cf-9a41cae499f3-logs\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.373431 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.373699 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.373809 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5889f4df85-qt29k"]
Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.398081 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jvrxf"]
Feb 19 18:51:43 crc kubenswrapper[4749]:
I0219 18:51:43.399308 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.412617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql7zx\" (UniqueName: \"kubernetes.io/projected/facae815-e01a-4641-b07c-3530303cc691-kube-api-access-ql7zx\") pod \"watcher-api-0\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " pod="openstack/watcher-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.420922 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" podUID="25a49c28-995a-475e-a81f-45632b61104d" containerName="dnsmasq-dns" containerID="cri-o://e313d87120d3eabcf0d8c7e3ff4af3de5074a5b09ab3e2b69effe282212aa9dd" gracePeriod=10 Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.426107 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.426370 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.426510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-h7wtm" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.431122 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-qr8r7"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.432416 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.442471 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fdrlz" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.442851 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.455863 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.456040 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-scripts\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6r7\" (UniqueName: \"kubernetes.io/projected/bd8ec365-fbcd-4680-90cf-9a41cae499f3-kube-api-access-4d6r7\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-config-data\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470464 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-combined-ca-bundle\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6e94fd-e114-41b2-8634-ca805b5e260f-logs\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-log-httpd\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-config-data\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-scripts\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bd8ec365-fbcd-4680-90cf-9a41cae499f3-logs\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7h2b\" (UniqueName: \"kubernetes.io/projected/4e6e94fd-e114-41b2-8634-ca805b5e260f-kube-api-access-d7h2b\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-run-httpd\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-config-data\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-combined-ca-bundle\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470730 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-config\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8ec365-fbcd-4680-90cf-9a41cae499f3-horizon-secret-key\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-scripts\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/66d13d51-29b5-463f-873d-4a586878e0c4-kube-api-access-pzbdf\") 
pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.470851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkr2c\" (UniqueName: \"kubernetes.io/projected/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-kube-api-access-hkr2c\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.471503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ec365-fbcd-4680-90cf-9a41cae499f3-logs\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.472006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-config-data\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.472431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-scripts\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.483411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8ec365-fbcd-4680-90cf-9a41cae499f3-horizon-secret-key\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc 
kubenswrapper[4749]: I0219 18:51:43.484254 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.490102 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jvrxf"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.518511 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cmbgb"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.519708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.551825 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.551992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6r7\" (UniqueName: \"kubernetes.io/projected/bd8ec365-fbcd-4680-90cf-9a41cae499f3-kube-api-access-4d6r7\") pod \"horizon-59c4845857-58lkr\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.559130 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cmbgb"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.565272 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.565528 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dxh5g" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-config-data\") pod \"placement-db-sync-qr8r7\" (UID: 
\"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-combined-ca-bundle\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-combined-ca-bundle\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6e94fd-e114-41b2-8634-ca805b5e260f-logs\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-log-httpd\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-config-data\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573725 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-scripts\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7h2b\" (UniqueName: \"kubernetes.io/projected/4e6e94fd-e114-41b2-8634-ca805b5e260f-kube-api-access-d7h2b\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573774 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-run-httpd\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-combined-ca-bundle\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-config\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-db-sync-config-data\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/66d13d51-29b5-463f-873d-4a586878e0c4-kube-api-access-pzbdf\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkr2c\" (UniqueName: \"kubernetes.io/projected/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-kube-api-access-hkr2c\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573945 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-scripts\") pod \"ceilometer-0\" (UID: 
\"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.573962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvv7t\" (UniqueName: \"kubernetes.io/projected/5701fcc2-ae2a-4017-8991-3470421ff234-kube-api-access-vvv7t\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.577849 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-run-httpd\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.590931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-log-httpd\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.591417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6e94fd-e114-41b2-8634-ca805b5e260f-logs\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.592501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-combined-ca-bundle\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.599345 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.600556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-config-data\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.604575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-scripts\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.606211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-config\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.606689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-scripts\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.612175 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qr8r7"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.615578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.620924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-config-data\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.621638 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-combined-ca-bundle\") pod \"placement-db-sync-qr8r7\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.637773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkr2c\" (UniqueName: \"kubernetes.io/projected/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-kube-api-access-hkr2c\") pod \"neutron-db-sync-jvrxf\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.642919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/66d13d51-29b5-463f-873d-4a586878e0c4-kube-api-access-pzbdf\") pod \"ceilometer-0\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") " pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.658806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7h2b\" (UniqueName: \"kubernetes.io/projected/4e6e94fd-e114-41b2-8634-ca805b5e260f-kube-api-access-d7h2b\") pod \"placement-db-sync-qr8r7\" (UID: 
\"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.666409 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dcd8cd889-wv2ds"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.667853 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.682259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-db-sync-config-data\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.682321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvv7t\" (UniqueName: \"kubernetes.io/projected/5701fcc2-ae2a-4017-8991-3470421ff234-kube-api-access-vvv7t\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.682354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-combined-ca-bundle\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.707108 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f8dgh"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.708308 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.714681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-db-sync-config-data\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.716395 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.716743 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4t4t8" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.719725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvv7t\" (UniqueName: \"kubernetes.io/projected/5701fcc2-ae2a-4017-8991-3470421ff234-kube-api-access-vvv7t\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.720762 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-combined-ca-bundle\") pod \"barbican-db-sync-cmbgb\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.721675 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.722373 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.747185 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcd8cd889-wv2ds"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.758422 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-svc\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh6wq\" (UniqueName: \"kubernetes.io/projected/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-kube-api-access-xh6wq\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-db-sync-config-data\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-combined-ca-bundle\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 
18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786511 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-etc-machine-id\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-config\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx45d\" (UniqueName: \"kubernetes.io/projected/b1a9182f-9dd3-40f6-a0b1-12b570382705-kube-api-access-fx45d\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-scripts\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 
crc kubenswrapper[4749]: I0219 18:51:43.786654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-config-data\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.786716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-swift-storage-0\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.808871 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qr8r7" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.819892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.825349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f8dgh"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.829248 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.833589 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6ff5f99c57-gtnp4"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.835141 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.843533 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.856890 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6ff5f99c57-gtnp4"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.856988 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.861058 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.861208 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.861377 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.861534 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mcps2" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.865491 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-scripts\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-svc\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-config-data\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh6wq\" (UniqueName: 
\"kubernetes.io/projected/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-kube-api-access-xh6wq\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-db-sync-config-data\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888891 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-combined-ca-bundle\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.888955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-etc-machine-id\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-config\") pod 
\"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-logs\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx45d\" (UniqueName: \"kubernetes.io/projected/b1a9182f-9dd3-40f6-a0b1-12b570382705-kube-api-access-fx45d\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-scripts\") pod \"cinder-db-sync-f8dgh\" (UID: 
\"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-config-data\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889203 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s8rj\" (UniqueName: \"kubernetes.io/projected/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-kube-api-access-9s8rj\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") 
" pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-logs\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889260 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mlb\" (UniqueName: \"kubernetes.io/projected/5c47c74b-13d3-47fa-859a-3b26113630b6-kube-api-access-k8mlb\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-swift-storage-0\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.889325 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-horizon-secret-key\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.892589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-etc-machine-id\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " 
pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.903864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-db-sync-config-data\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.908547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-scripts\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.912422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-config-data\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.913422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-svc\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.913503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.913779 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-swift-storage-0\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.915209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-config\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.920927 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.925457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-combined-ca-bundle\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.925476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh6wq\" (UniqueName: \"kubernetes.io/projected/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-kube-api-access-xh6wq\") pod \"cinder-db-sync-f8dgh\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") " pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.933577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx45d\" (UniqueName: 
\"kubernetes.io/projected/b1a9182f-9dd3-40f6-a0b1-12b570382705-kube-api-access-fx45d\") pod \"dnsmasq-dns-6dcd8cd889-wv2ds\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.939147 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.942750 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.966349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.970829 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.970973 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.996777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.996860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s8rj\" (UniqueName: \"kubernetes.io/projected/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-kube-api-access-9s8rj\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.997118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-logs\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.997147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mlb\" (UniqueName: \"kubernetes.io/projected/5c47c74b-13d3-47fa-859a-3b26113630b6-kube-api-access-k8mlb\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.997225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-horizon-secret-key\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.997291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.997339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-scripts\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.997375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.997413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-config-data\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.998287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.998666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-logs\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.998812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:43 crc kubenswrapper[4749]: I0219 18:51:43.998907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.000440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-logs\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.001184 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.001400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-logs\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.001878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-scripts\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.003149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-config-data\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:44 crc 
kubenswrapper[4749]: I0219 18:51:44.003352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.006240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-horizon-secret-key\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.009134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.018800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.023174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.023301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-9s8rj\" (UniqueName: \"kubernetes.io/projected/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-kube-api-access-9s8rj\") pod \"horizon-6ff5f99c57-gtnp4\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.037226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8mlb\" (UniqueName: \"kubernetes.io/projected/5c47c74b-13d3-47fa-859a-3b26113630b6-kube-api-access-k8mlb\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.051060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-logs\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.102493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zprll\" (UniqueName: \"kubernetes.io/projected/eff44afa-e6ee-4d6f-a041-3734a9e4a782-kube-api-access-zprll\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.104976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.171564 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.196160 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f8dgh" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-logs\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204343 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204408 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zprll\" (UniqueName: \"kubernetes.io/projected/eff44afa-e6ee-4d6f-a041-3734a9e4a782-kube-api-access-zprll\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.204998 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.232932 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.292048 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.292325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-logs\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.298421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.303757 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 
18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.313621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zprll\" (UniqueName: \"kubernetes.io/projected/eff44afa-e6ee-4d6f-a041-3734a9e4a782-kube-api-access-zprll\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.314547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.316146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.316621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.322508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.458760 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="25a49c28-995a-475e-a81f-45632b61104d" containerID="e313d87120d3eabcf0d8c7e3ff4af3de5074a5b09ab3e2b69effe282212aa9dd" exitCode=0 Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.458807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" event={"ID":"25a49c28-995a-475e-a81f-45632b61104d","Type":"ContainerDied","Data":"e313d87120d3eabcf0d8c7e3ff4af3de5074a5b09ab3e2b69effe282212aa9dd"} Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.466453 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8rjdg"] Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.486631 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5889f4df85-qt29k"] Feb 19 18:51:44 crc kubenswrapper[4749]: W0219 18:51:44.531740 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba6e5340_1a1a_421a_b3bb_f0d83811a28e.slice/crio-ed59e3a7a4c832aeb7818e30894b69bb287e8a25079861ea61814b79f3d81678 WatchSource:0}: Error finding container ed59e3a7a4c832aeb7818e30894b69bb287e8a25079861ea61814b79f3d81678: Status 404 returned error can't find the container with id ed59e3a7a4c832aeb7818e30894b69bb287e8a25079861ea61814b79f3d81678 Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.548952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.596088 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.604814 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.709102 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.817105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-sb\") pod \"25a49c28-995a-475e-a81f-45632b61104d\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.817409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb4rr\" (UniqueName: \"kubernetes.io/projected/25a49c28-995a-475e-a81f-45632b61104d-kube-api-access-lb4rr\") pod \"25a49c28-995a-475e-a81f-45632b61104d\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.817463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-svc\") pod \"25a49c28-995a-475e-a81f-45632b61104d\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.817491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-config\") pod \"25a49c28-995a-475e-a81f-45632b61104d\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.817569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-swift-storage-0\") pod \"25a49c28-995a-475e-a81f-45632b61104d\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.817616 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-nb\") pod \"25a49c28-995a-475e-a81f-45632b61104d\" (UID: \"25a49c28-995a-475e-a81f-45632b61104d\") " Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.826238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a49c28-995a-475e-a81f-45632b61104d-kube-api-access-lb4rr" (OuterVolumeSpecName: "kube-api-access-lb4rr") pod "25a49c28-995a-475e-a81f-45632b61104d" (UID: "25a49c28-995a-475e-a81f-45632b61104d"). InnerVolumeSpecName "kube-api-access-lb4rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.920703 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb4rr\" (UniqueName: \"kubernetes.io/projected/25a49c28-995a-475e-a81f-45632b61104d-kube-api-access-lb4rr\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.992636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25a49c28-995a-475e-a81f-45632b61104d" (UID: "25a49c28-995a-475e-a81f-45632b61104d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:44 crc kubenswrapper[4749]: I0219 18:51:44.997474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25a49c28-995a-475e-a81f-45632b61104d" (UID: "25a49c28-995a-475e-a81f-45632b61104d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.002428 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25a49c28-995a-475e-a81f-45632b61104d" (UID: "25a49c28-995a-475e-a81f-45632b61104d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.005519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25a49c28-995a-475e-a81f-45632b61104d" (UID: "25a49c28-995a-475e-a81f-45632b61104d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.011547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-config" (OuterVolumeSpecName: "config") pod "25a49c28-995a-475e-a81f-45632b61104d" (UID: "25a49c28-995a-475e-a81f-45632b61104d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.022331 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.022362 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.022372 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.022381 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.022390 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25a49c28-995a-475e-a81f-45632b61104d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.144240 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.166770 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cmbgb"] Feb 19 18:51:45 crc kubenswrapper[4749]: W0219 18:51:45.202238 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5701fcc2_ae2a_4017_8991_3470421ff234.slice/crio-37a318854400c75cd258f5bde67fc9ea170b4c0c935cec65eda508fb2ec8e8bc WatchSource:0}: Error finding container 37a318854400c75cd258f5bde67fc9ea170b4c0c935cec65eda508fb2ec8e8bc: Status 404 returned error can't find the container with id 37a318854400c75cd258f5bde67fc9ea170b4c0c935cec65eda508fb2ec8e8bc Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.216003 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59c4845857-58lkr"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.235103 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jvrxf"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.258167 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qr8r7"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.290369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.450833 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6ff5f99c57-gtnp4"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.467603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d504807c-589e-4665-bc1d-865473001928","Type":"ContainerStarted","Data":"21307c92df3606595000710a5a2a6268c2e9b4cc817a0b2db1415f08e1e1ac45"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.468516 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcd8cd889-wv2ds"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.469102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cmbgb" event={"ID":"5701fcc2-ae2a-4017-8991-3470421ff234","Type":"ContainerStarted","Data":"37a318854400c75cd258f5bde67fc9ea170b4c0c935cec65eda508fb2ec8e8bc"} Feb 19 18:51:45 crc 
kubenswrapper[4749]: I0219 18:51:45.471822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" event={"ID":"25a49c28-995a-475e-a81f-45632b61104d","Type":"ContainerDied","Data":"79eec8e199b87de0a27a5606ba1f965c91e263384cb9b513197600d562675457"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.471870 4749 scope.go:117] "RemoveContainer" containerID="e313d87120d3eabcf0d8c7e3ff4af3de5074a5b09ab3e2b69effe282212aa9dd" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.471980 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778545d54f-gjkrp" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.477214 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f8dgh"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.486290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"facae815-e01a-4641-b07c-3530303cc691","Type":"ContainerStarted","Data":"ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.486332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"facae815-e01a-4641-b07c-3530303cc691","Type":"ContainerStarted","Data":"e318394177cf0f3c51292edf370d483582b16dd533b4e12b59a4c82b5ac4c570"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.498276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerStarted","Data":"7658ea7fdcb8badbdbe7a597858743b9fe69f892e9dcebce8e21bea2fab84bce"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.500755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rjdg" 
event={"ID":"ba6e5340-1a1a-421a-b3bb-f0d83811a28e","Type":"ContainerStarted","Data":"7379a11bb2a3623b7b06c52e6724ffd514a62505ff46b0039924f9d92a454eaa"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.500791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rjdg" event={"ID":"ba6e5340-1a1a-421a-b3bb-f0d83811a28e","Type":"ContainerStarted","Data":"ed59e3a7a4c832aeb7818e30894b69bb287e8a25079861ea61814b79f3d81678"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.503408 4749 generic.go:334] "Generic (PLEG): container finished" podID="b16d3f1e-45df-4bd3-83f7-3c9396a8de95" containerID="eca5df20be40e664be83ba816954b2f43cc7cc2fd4ad412de9337b007d97095c" exitCode=0 Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.503450 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5889f4df85-qt29k" event={"ID":"b16d3f1e-45df-4bd3-83f7-3c9396a8de95","Type":"ContainerDied","Data":"eca5df20be40e664be83ba816954b2f43cc7cc2fd4ad412de9337b007d97095c"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.503495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5889f4df85-qt29k" event={"ID":"b16d3f1e-45df-4bd3-83f7-3c9396a8de95","Type":"ContainerStarted","Data":"340499855c8bb098e278c1ebbe8ab12fdd6917dd924bad1179e25c8d2331c8f6"} Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.544550 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778545d54f-gjkrp"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.559528 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-778545d54f-gjkrp"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.570829 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8rjdg" podStartSLOduration=3.570810613 podStartE2EDuration="3.570810613s" podCreationTimestamp="2026-02-19 18:51:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:45.546463061 +0000 UTC m=+1079.507683055" watchObservedRunningTime="2026-02-19 18:51:45.570810613 +0000 UTC m=+1079.532030567" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.591721 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.766708 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.785673 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.833174 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59c4845857-58lkr"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.848465 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.873069 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8678f4d997-tsdv8"] Feb 19 18:51:45 crc kubenswrapper[4749]: E0219 18:51:45.873601 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a49c28-995a-475e-a81f-45632b61104d" containerName="dnsmasq-dns" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.873618 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a49c28-995a-475e-a81f-45632b61104d" containerName="dnsmasq-dns" Feb 19 18:51:45 crc kubenswrapper[4749]: E0219 18:51:45.873648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a49c28-995a-475e-a81f-45632b61104d" containerName="init" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.873656 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a49c28-995a-475e-a81f-45632b61104d" containerName="init" Feb 
19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.873873 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a49c28-995a-475e-a81f-45632b61104d" containerName="dnsmasq-dns" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.875099 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.928151 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8678f4d997-tsdv8"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.936334 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.951628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac47ec62-11c1-4eea-91c5-0331050fd880-logs\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.951683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-config-data\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.951711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw4rw\" (UniqueName: \"kubernetes.io/projected/ac47ec62-11c1-4eea-91c5-0331050fd880-kube-api-access-jw4rw\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.951766 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac47ec62-11c1-4eea-91c5-0331050fd880-horizon-secret-key\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:45 crc kubenswrapper[4749]: I0219 18:51:45.951830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-scripts\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.053056 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-config-data\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.053101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw4rw\" (UniqueName: \"kubernetes.io/projected/ac47ec62-11c1-4eea-91c5-0331050fd880-kube-api-access-jw4rw\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.053165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac47ec62-11c1-4eea-91c5-0331050fd880-horizon-secret-key\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.053229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-scripts\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.053268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac47ec62-11c1-4eea-91c5-0331050fd880-logs\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.053532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac47ec62-11c1-4eea-91c5-0331050fd880-logs\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.054311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-scripts\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.054871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-config-data\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.065573 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac47ec62-11c1-4eea-91c5-0331050fd880-horizon-secret-key\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.068999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw4rw\" (UniqueName: \"kubernetes.io/projected/ac47ec62-11c1-4eea-91c5-0331050fd880-kube-api-access-jw4rw\") pod \"horizon-8678f4d997-tsdv8\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: W0219 18:51:46.189242 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd8ec365_fbcd_4680_90cf_9a41cae499f3.slice/crio-c7608286a4a0987c5d4894464a3d7561cc2beddd22e03f44777556c4553e6b08 WatchSource:0}: Error finding container c7608286a4a0987c5d4894464a3d7561cc2beddd22e03f44777556c4553e6b08: Status 404 returned error can't find the container with id c7608286a4a0987c5d4894464a3d7561cc2beddd22e03f44777556c4553e6b08
Feb 19 18:51:46 crc kubenswrapper[4749]: W0219 18:51:46.201748 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a8e773_dd45_4aa6_96ba_d3e6cb89f42d.slice/crio-16f9265adf1eeb0e3748eff49f78899440489e65956dcd213849389af444239e WatchSource:0}: Error finding container 16f9265adf1eeb0e3748eff49f78899440489e65956dcd213849389af444239e: Status 404 returned error can't find the container with id 16f9265adf1eeb0e3748eff49f78899440489e65956dcd213849389af444239e
Feb 19 18:51:46 crc kubenswrapper[4749]: W0219 18:51:46.227412 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff44afa_e6ee_4d6f_a041_3734a9e4a782.slice/crio-546e63d61a6b047c246182dd4dd3b29ec17a06bbb31f617717cbc677963581c7 WatchSource:0}: Error finding container 546e63d61a6b047c246182dd4dd3b29ec17a06bbb31f617717cbc677963581c7: Status 404 returned error can't find the container with id 546e63d61a6b047c246182dd4dd3b29ec17a06bbb31f617717cbc677963581c7
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.251776 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8678f4d997-tsdv8"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.320198 4749 scope.go:117] "RemoveContainer" containerID="d29ebf567c6e8b75e175622c9aba2535778054730821ed7d8ef3bd17f7197f33"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.322595 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.461546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-svc\") pod \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") "
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.461588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-sb\") pod \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") "
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.461625 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfbzv\" (UniqueName: \"kubernetes.io/projected/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-kube-api-access-tfbzv\") pod \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") "
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.461649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-config\") pod \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") "
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.461723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-swift-storage-0\") pod \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") "
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.461749 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-nb\") pod \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\" (UID: \"b16d3f1e-45df-4bd3-83f7-3c9396a8de95\") "
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.486229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-kube-api-access-tfbzv" (OuterVolumeSpecName: "kube-api-access-tfbzv") pod "b16d3f1e-45df-4bd3-83f7-3c9396a8de95" (UID: "b16d3f1e-45df-4bd3-83f7-3c9396a8de95"). InnerVolumeSpecName "kube-api-access-tfbzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.527628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-config" (OuterVolumeSpecName: "config") pod "b16d3f1e-45df-4bd3-83f7-3c9396a8de95" (UID: "b16d3f1e-45df-4bd3-83f7-3c9396a8de95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.528820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b16d3f1e-45df-4bd3-83f7-3c9396a8de95" (UID: "b16d3f1e-45df-4bd3-83f7-3c9396a8de95"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.548763 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b16d3f1e-45df-4bd3-83f7-3c9396a8de95" (UID: "b16d3f1e-45df-4bd3-83f7-3c9396a8de95"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.562451 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b16d3f1e-45df-4bd3-83f7-3c9396a8de95" (UID: "b16d3f1e-45df-4bd3-83f7-3c9396a8de95"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.564796 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfbzv\" (UniqueName: \"kubernetes.io/projected/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-kube-api-access-tfbzv\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.564831 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.564843 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.564853 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.564862 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.584868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b16d3f1e-45df-4bd3-83f7-3c9396a8de95" (UID: "b16d3f1e-45df-4bd3-83f7-3c9396a8de95"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.597225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" event={"ID":"b1a9182f-9dd3-40f6-a0b1-12b570382705","Type":"ContainerStarted","Data":"2296a32bf91398e5096c030f8233284784a37cc2b27348420bd2ab94aa4ba5aa"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.606746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"facae815-e01a-4641-b07c-3530303cc691","Type":"ContainerStarted","Data":"d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.606832 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api-log" containerID="cri-o://ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7" gracePeriod=30
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.606974 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api" containerID="cri-o://d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db" gracePeriod=30
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.607240 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.611407 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.614495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerStarted","Data":"995fa8fcb59ed8027892c9e5f75bee09ddddc8ec01d3286d286b049176900f50"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.616102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ff5f99c57-gtnp4" event={"ID":"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0","Type":"ContainerStarted","Data":"3cfeec090f7f50e440cb1cdb491384d96fd5188a9d4c6124f44f0356a60f0625"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.617410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qr8r7" event={"ID":"4e6e94fd-e114-41b2-8634-ca805b5e260f","Type":"ContainerStarted","Data":"61100cbda9d361ef97e18e799345a72bb4279dd423e3032bbf4fa2a109dabb29"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.618923 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59c4845857-58lkr" event={"ID":"bd8ec365-fbcd-4680-90cf-9a41cae499f3","Type":"ContainerStarted","Data":"c7608286a4a0987c5d4894464a3d7561cc2beddd22e03f44777556c4553e6b08"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.629465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5889f4df85-qt29k" event={"ID":"b16d3f1e-45df-4bd3-83f7-3c9396a8de95","Type":"ContainerDied","Data":"340499855c8bb098e278c1ebbe8ab12fdd6917dd924bad1179e25c8d2331c8f6"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.629514 4749 scope.go:117] "RemoveContainer" containerID="eca5df20be40e664be83ba816954b2f43cc7cc2fd4ad412de9337b007d97095c"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.629610 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5889f4df85-qt29k"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.639974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff44afa-e6ee-4d6f-a041-3734a9e4a782","Type":"ContainerStarted","Data":"546e63d61a6b047c246182dd4dd3b29ec17a06bbb31f617717cbc677963581c7"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.664477 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.664459824 podStartE2EDuration="4.664459824s" podCreationTimestamp="2026-02-19 18:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:46.629585865 +0000 UTC m=+1080.590805819" watchObservedRunningTime="2026-02-19 18:51:46.664459824 +0000 UTC m=+1080.625679778"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.692157 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b16d3f1e-45df-4bd3-83f7-3c9396a8de95-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.788142 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a49c28-995a-475e-a81f-45632b61104d" path="/var/lib/kubelet/pods/25a49c28-995a-475e-a81f-45632b61104d/volumes"
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.790818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f8dgh" event={"ID":"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d","Type":"ContainerStarted","Data":"16f9265adf1eeb0e3748eff49f78899440489e65956dcd213849389af444239e"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.790866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jvrxf" event={"ID":"9efa7d7f-28c0-4bd7-ae4f-f988968544c0","Type":"ContainerStarted","Data":"0d743b7b2b16f0298b31b12e733f020cef7960475ba08e5103cab7a47bb20256"}
Feb 19 18:51:46 crc kubenswrapper[4749]: I0219 18:51:46.836050 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.239121 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8678f4d997-tsdv8"]
Feb 19 18:51:47 crc kubenswrapper[4749]: W0219 18:51:47.266015 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac47ec62_11c1_4eea_91c5_0331050fd880.slice/crio-f0c3f447954ade864d88c78fc089d9ee8d1a26916b4c207b0391d7208e0e2acb WatchSource:0}: Error finding container f0c3f447954ade864d88c78fc089d9ee8d1a26916b4c207b0391d7208e0e2acb: Status 404 returned error can't find the container with id f0c3f447954ade864d88c78fc089d9ee8d1a26916b4c207b0391d7208e0e2acb
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.308953 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5889f4df85-qt29k"]
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.326968 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5889f4df85-qt29k"]
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.728577 4749 generic.go:334] "Generic (PLEG): container finished" podID="facae815-e01a-4641-b07c-3530303cc691" containerID="ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7" exitCode=143
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.728679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"facae815-e01a-4641-b07c-3530303cc691","Type":"ContainerDied","Data":"ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7"}
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.736516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d504807c-589e-4665-bc1d-865473001928","Type":"ContainerStarted","Data":"897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885"}
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.738018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jvrxf" event={"ID":"9efa7d7f-28c0-4bd7-ae4f-f988968544c0","Type":"ContainerStarted","Data":"59151c6bf16b4a152bc622f54e0507052951f8cc27cdb87f310817d937e10c53"}
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.740382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8678f4d997-tsdv8" event={"ID":"ac47ec62-11c1-4eea-91c5-0331050fd880","Type":"ContainerStarted","Data":"f0c3f447954ade864d88c78fc089d9ee8d1a26916b4c207b0391d7208e0e2acb"}
Feb 19 18:51:47 crc kubenswrapper[4749]: I0219 18:51:47.746349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c47c74b-13d3-47fa-859a-3b26113630b6","Type":"ContainerStarted","Data":"7655c226353c5f7b5f909c8143c3d12f2d6809dc196016229c302de7ed6d9954"}
Feb 19 18:51:48 crc kubenswrapper[4749]: I0219 18:51:48.487174 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 19 18:51:48 crc kubenswrapper[4749]: I0219 18:51:48.716744 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b16d3f1e-45df-4bd3-83f7-3c9396a8de95" path="/var/lib/kubelet/pods/b16d3f1e-45df-4bd3-83f7-3c9396a8de95/volumes"
Feb 19 18:51:48 crc kubenswrapper[4749]: I0219 18:51:48.771561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c47c74b-13d3-47fa-859a-3b26113630b6","Type":"ContainerStarted","Data":"a0410ab97e285a06c2966cd70f61787bee43eba44b005f0657a88e4cfcf7019b"}
Feb 19 18:51:48 crc kubenswrapper[4749]: I0219 18:51:48.773946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff44afa-e6ee-4d6f-a041-3734a9e4a782","Type":"ContainerStarted","Data":"a0a5fca73f5fe6c3ebf775e367b944412a62d951e18a6d1a1fda0ef64a52a1bb"}
Feb 19 18:51:48 crc kubenswrapper[4749]: I0219 18:51:48.778959 4749 generic.go:334] "Generic (PLEG): container finished" podID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerID="864ca710bf18b9c2c27af884e0d6f7d42aba809fb695ab137ad4f67f227c1993" exitCode=0
Feb 19 18:51:48 crc kubenswrapper[4749]: I0219 18:51:48.779824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" event={"ID":"b1a9182f-9dd3-40f6-a0b1-12b570382705","Type":"ContainerDied","Data":"864ca710bf18b9c2c27af884e0d6f7d42aba809fb695ab137ad4f67f227c1993"}
Feb 19 18:51:48 crc kubenswrapper[4749]: I0219 18:51:48.804505 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jvrxf" podStartSLOduration=5.804489406 podStartE2EDuration="5.804489406s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:48.797650889 +0000 UTC m=+1082.758870843" watchObservedRunningTime="2026-02-19 18:51:48.804489406 +0000 UTC m=+1082.765709360"
Feb 19 18:51:49 crc kubenswrapper[4749]: I0219 18:51:49.820081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" event={"ID":"b1a9182f-9dd3-40f6-a0b1-12b570382705","Type":"ContainerStarted","Data":"68823225846a106051d510641f982fd731d3cc7b2a596eb3b8a2beecd361de79"}
Feb 19 18:51:49 crc kubenswrapper[4749]: I0219 18:51:49.821577 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds"
Feb 19 18:51:49 crc kubenswrapper[4749]: I0219 18:51:49.838156 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=5.959374442 podStartE2EDuration="7.838139086s" podCreationTimestamp="2026-02-19 18:51:42 +0000 UTC" firstStartedPulling="2026-02-19 18:51:44.60157427 +0000 UTC m=+1078.562794224" lastFinishedPulling="2026-02-19 18:51:46.480338914 +0000 UTC m=+1080.441558868" observedRunningTime="2026-02-19 18:51:48.848410654 +0000 UTC m=+1082.809630618" watchObservedRunningTime="2026-02-19 18:51:49.838139086 +0000 UTC m=+1083.799359040"
Feb 19 18:51:49 crc kubenswrapper[4749]: I0219 18:51:49.840549 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" podStartSLOduration=6.840540275 podStartE2EDuration="6.840540275s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:49.837468809 +0000 UTC m=+1083.798688773" watchObservedRunningTime="2026-02-19 18:51:49.840540275 +0000 UTC m=+1083.801760229"
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.835875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c47c74b-13d3-47fa-859a-3b26113630b6","Type":"ContainerStarted","Data":"897fdd4e61c3eca9165f228a49ca773afc4b23a7886f0ec1b3e4c8c8365063dd"}
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.836264 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-httpd" containerID="cri-o://897fdd4e61c3eca9165f228a49ca773afc4b23a7886f0ec1b3e4c8c8365063dd" gracePeriod=30
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.835958 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-log" containerID="cri-o://a0410ab97e285a06c2966cd70f61787bee43eba44b005f0657a88e4cfcf7019b" gracePeriod=30
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.839753 4749 generic.go:334] "Generic (PLEG): container finished" podID="ba6e5340-1a1a-421a-b3bb-f0d83811a28e" containerID="7379a11bb2a3623b7b06c52e6724ffd514a62505ff46b0039924f9d92a454eaa" exitCode=0
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.839830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rjdg" event={"ID":"ba6e5340-1a1a-421a-b3bb-f0d83811a28e","Type":"ContainerDied","Data":"7379a11bb2a3623b7b06c52e6724ffd514a62505ff46b0039924f9d92a454eaa"}
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.848541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff44afa-e6ee-4d6f-a041-3734a9e4a782","Type":"ContainerStarted","Data":"1c7b85a94c5bf116444861434a4880d49fe54402ee0914e03635675f8667c31e"}
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.848661 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-log" containerID="cri-o://a0a5fca73f5fe6c3ebf775e367b944412a62d951e18a6d1a1fda0ef64a52a1bb" gracePeriod=30
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.848708 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-httpd" containerID="cri-o://1c7b85a94c5bf116444861434a4880d49fe54402ee0914e03635675f8667c31e" gracePeriod=30
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.858347 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.85832922 podStartE2EDuration="7.85832922s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:50.855995793 +0000 UTC m=+1084.817215757" watchObservedRunningTime="2026-02-19 18:51:50.85832922 +0000 UTC m=+1084.819549174"
Feb 19 18:51:50 crc kubenswrapper[4749]: I0219 18:51:50.907691 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.90766826 podStartE2EDuration="7.90766826s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:51:50.896066078 +0000 UTC m=+1084.857286042" watchObservedRunningTime="2026-02-19 18:51:50.90766826 +0000 UTC m=+1084.868888214"
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.427863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.859377 4749 generic.go:334] "Generic (PLEG): container finished" podID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerID="1c7b85a94c5bf116444861434a4880d49fe54402ee0914e03635675f8667c31e" exitCode=0
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.859409 4749 generic.go:334] "Generic (PLEG): container finished" podID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerID="a0a5fca73f5fe6c3ebf775e367b944412a62d951e18a6d1a1fda0ef64a52a1bb" exitCode=143
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.859446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff44afa-e6ee-4d6f-a041-3734a9e4a782","Type":"ContainerDied","Data":"1c7b85a94c5bf116444861434a4880d49fe54402ee0914e03635675f8667c31e"}
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.859509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff44afa-e6ee-4d6f-a041-3734a9e4a782","Type":"ContainerDied","Data":"a0a5fca73f5fe6c3ebf775e367b944412a62d951e18a6d1a1fda0ef64a52a1bb"}
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.862410 4749 generic.go:334] "Generic (PLEG): container finished" podID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerID="897fdd4e61c3eca9165f228a49ca773afc4b23a7886f0ec1b3e4c8c8365063dd" exitCode=0
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.862430 4749 generic.go:334] "Generic (PLEG): container finished" podID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerID="a0410ab97e285a06c2966cd70f61787bee43eba44b005f0657a88e4cfcf7019b" exitCode=143
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.862438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c47c74b-13d3-47fa-859a-3b26113630b6","Type":"ContainerDied","Data":"897fdd4e61c3eca9165f228a49ca773afc4b23a7886f0ec1b3e4c8c8365063dd"}
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.862465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c47c74b-13d3-47fa-859a-3b26113630b6","Type":"ContainerDied","Data":"a0410ab97e285a06c2966cd70f61787bee43eba44b005f0657a88e4cfcf7019b"}
Feb 19 18:51:51 crc kubenswrapper[4749]: I0219 18:51:51.984182 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6ff5f99c57-gtnp4"]
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.004444 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-866f5f4f4b-zfvsn"]
Feb 19 18:51:52 crc kubenswrapper[4749]: E0219 18:51:52.004851 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16d3f1e-45df-4bd3-83f7-3c9396a8de95" containerName="init"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.004864 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16d3f1e-45df-4bd3-83f7-3c9396a8de95" containerName="init"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.005087 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16d3f1e-45df-4bd3-83f7-3c9396a8de95" containerName="init"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.006173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.011392 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.029070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-866f5f4f4b-zfvsn"]
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.075164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-config-data\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.075239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-secret-key\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.075273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-scripts\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.075313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dcae23-3df3-4de3-8a0a-499c15a90daa-logs\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.075506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-combined-ca-bundle\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.075576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-tls-certs\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.075754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x24s\" (UniqueName: \"kubernetes.io/projected/f8dcae23-3df3-4de3-8a0a-499c15a90daa-kube-api-access-9x24s\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.093715 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8678f4d997-tsdv8"]
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.167856 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b4d589db8-c89ft"]
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.173233 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8812ac95-8284-4b4f-a838-b5ab30a55fad-config-data\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8812ac95-8284-4b4f-a838-b5ab30a55fad-scripts\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-secret-key\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176581 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8812ac95-8284-4b4f-a838-b5ab30a55fad-logs\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176604 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-scripts\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-horizon-tls-certs\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dcae23-3df3-4de3-8a0a-499c15a90daa-logs\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-combined-ca-bundle\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-horizon-secret-key\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-tls-certs\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x24s\" (UniqueName: \"kubernetes.io/projected/f8dcae23-3df3-4de3-8a0a-499c15a90daa-kube-api-access-9x24s\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vgt\" (UniqueName: \"kubernetes.io/projected/8812ac95-8284-4b4f-a838-b5ab30a55fad-kube-api-access-46vgt\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-config-data\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.176833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-combined-ca-bundle\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.178927 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-config-data\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.179341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dcae23-3df3-4de3-8a0a-499c15a90daa-logs\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.179477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-scripts\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.192616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-tls-certs\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.196803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-combined-ca-bundle\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:
18:51:52.198446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-secret-key\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.199877 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b4d589db8-c89ft"] Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.281703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vgt\" (UniqueName: \"kubernetes.io/projected/8812ac95-8284-4b4f-a838-b5ab30a55fad-kube-api-access-46vgt\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.281763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-combined-ca-bundle\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.281791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8812ac95-8284-4b4f-a838-b5ab30a55fad-config-data\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.281810 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8812ac95-8284-4b4f-a838-b5ab30a55fad-scripts\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " 
pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.281847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8812ac95-8284-4b4f-a838-b5ab30a55fad-logs\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.281876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-horizon-tls-certs\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.281912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-horizon-secret-key\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.291365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x24s\" (UniqueName: \"kubernetes.io/projected/f8dcae23-3df3-4de3-8a0a-499c15a90daa-kube-api-access-9x24s\") pod \"horizon-866f5f4f4b-zfvsn\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") " pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.307136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8812ac95-8284-4b4f-a838-b5ab30a55fad-config-data\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.310493 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8812ac95-8284-4b4f-a838-b5ab30a55fad-logs\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.312062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8812ac95-8284-4b4f-a838-b5ab30a55fad-scripts\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.329823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-combined-ca-bundle\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.344473 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.346016 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-horizon-secret-key\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.349653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vgt\" (UniqueName: \"kubernetes.io/projected/8812ac95-8284-4b4f-a838-b5ab30a55fad-kube-api-access-46vgt\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.381669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8812ac95-8284-4b4f-a838-b5ab30a55fad-horizon-tls-certs\") pod \"horizon-7b4d589db8-c89ft\" (UID: \"8812ac95-8284-4b4f-a838-b5ab30a55fad\") " pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:52 crc kubenswrapper[4749]: I0219 18:51:52.449859 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:51:53 crc kubenswrapper[4749]: I0219 18:51:53.193551 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 19 18:51:53 crc kubenswrapper[4749]: I0219 18:51:53.193871 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 19 18:51:53 crc kubenswrapper[4749]: I0219 18:51:53.231792 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 19 18:51:53 crc kubenswrapper[4749]: I0219 18:51:53.905684 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 19 18:51:53 crc kubenswrapper[4749]: I0219 18:51:53.947479 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:51:54 crc kubenswrapper[4749]: I0219 18:51:54.173975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:51:54 crc kubenswrapper[4749]: I0219 18:51:54.241187 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746f4bbcc9-sjckp"] Feb 19 18:51:54 crc kubenswrapper[4749]: I0219 18:51:54.241460 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="dnsmasq-dns" containerID="cri-o://8ba3d5b99d0381c433531c1399e7c1130409eac94c1973be2efe946a9286bcb0" gracePeriod=10 Feb 19 18:51:54 crc kubenswrapper[4749]: I0219 18:51:54.890382 4749 generic.go:334] "Generic (PLEG): container finished" podID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerID="8ba3d5b99d0381c433531c1399e7c1130409eac94c1973be2efe946a9286bcb0" exitCode=0 Feb 19 18:51:54 crc kubenswrapper[4749]: I0219 18:51:54.890530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" event={"ID":"feeb65e2-e83b-4028-b7de-fa94205ccd40","Type":"ContainerDied","Data":"8ba3d5b99d0381c433531c1399e7c1130409eac94c1973be2efe946a9286bcb0"} Feb 19 18:51:55 crc kubenswrapper[4749]: I0219 18:51:55.898306 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" containerID="cri-o://897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" gracePeriod=30 Feb 19 18:51:57 crc kubenswrapper[4749]: I0219 18:51:57.915989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rjdg" event={"ID":"ba6e5340-1a1a-421a-b3bb-f0d83811a28e","Type":"ContainerDied","Data":"ed59e3a7a4c832aeb7818e30894b69bb287e8a25079861ea61814b79f3d81678"} Feb 19 18:51:57 crc kubenswrapper[4749]: I0219 18:51:57.916461 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed59e3a7a4c832aeb7818e30894b69bb287e8a25079861ea61814b79f3d81678" Feb 19 18:51:57 crc kubenswrapper[4749]: I0219 18:51:57.937066 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8rjdg" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.022951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkf6q\" (UniqueName: \"kubernetes.io/projected/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-kube-api-access-wkf6q\") pod \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.023014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-fernet-keys\") pod \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.023052 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-combined-ca-bundle\") pod \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.023159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-scripts\") pod \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.023237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-config-data\") pod \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.023287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-credential-keys\") pod \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\" (UID: \"ba6e5340-1a1a-421a-b3bb-f0d83811a28e\") " Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.028762 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-scripts" (OuterVolumeSpecName: "scripts") pod "ba6e5340-1a1a-421a-b3bb-f0d83811a28e" (UID: "ba6e5340-1a1a-421a-b3bb-f0d83811a28e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.029502 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-kube-api-access-wkf6q" (OuterVolumeSpecName: "kube-api-access-wkf6q") pod "ba6e5340-1a1a-421a-b3bb-f0d83811a28e" (UID: "ba6e5340-1a1a-421a-b3bb-f0d83811a28e"). InnerVolumeSpecName "kube-api-access-wkf6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.031131 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ba6e5340-1a1a-421a-b3bb-f0d83811a28e" (UID: "ba6e5340-1a1a-421a-b3bb-f0d83811a28e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.031261 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ba6e5340-1a1a-421a-b3bb-f0d83811a28e" (UID: "ba6e5340-1a1a-421a-b3bb-f0d83811a28e"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.050589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-config-data" (OuterVolumeSpecName: "config-data") pod "ba6e5340-1a1a-421a-b3bb-f0d83811a28e" (UID: "ba6e5340-1a1a-421a-b3bb-f0d83811a28e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.052236 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba6e5340-1a1a-421a-b3bb-f0d83811a28e" (UID: "ba6e5340-1a1a-421a-b3bb-f0d83811a28e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.126317 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.126379 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.126401 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkf6q\" (UniqueName: \"kubernetes.io/projected/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-kube-api-access-wkf6q\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.126417 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 
19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.126434 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.126450 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6e5340-1a1a-421a-b3bb-f0d83811a28e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:51:58 crc kubenswrapper[4749]: E0219 18:51:58.196668 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:51:58 crc kubenswrapper[4749]: E0219 18:51:58.198677 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:51:58 crc kubenswrapper[4749]: E0219 18:51:58.200245 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:51:58 crc kubenswrapper[4749]: E0219 18:51:58.200275 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" 
podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.859422 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.925784 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8rjdg" Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.927213 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerStarted","Data":"017ee1107b65424619a4a0a0459814477d6930cc4863971e4911ed32a3f6c743"} Feb 19 18:51:58 crc kubenswrapper[4749]: I0219 18:51:58.946138 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=12.780603937 podStartE2EDuration="16.946119802s" podCreationTimestamp="2026-02-19 18:51:42 +0000 UTC" firstStartedPulling="2026-02-19 18:51:45.169582911 +0000 UTC m=+1079.130802865" lastFinishedPulling="2026-02-19 18:51:49.335098776 +0000 UTC m=+1083.296318730" observedRunningTime="2026-02-19 18:51:58.942169216 +0000 UTC m=+1092.903389190" watchObservedRunningTime="2026-02-19 18:51:58.946119802 +0000 UTC m=+1092.907339746" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.020658 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8rjdg"] Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.030581 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8rjdg"] Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.118931 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-bootstrap-wsg2v"] Feb 19 18:51:59 crc kubenswrapper[4749]: E0219 18:51:59.119450 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6e5340-1a1a-421a-b3bb-f0d83811a28e" containerName="keystone-bootstrap" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.119474 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6e5340-1a1a-421a-b3bb-f0d83811a28e" containerName="keystone-bootstrap" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.119676 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6e5340-1a1a-421a-b3bb-f0d83811a28e" containerName="keystone-bootstrap" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.120403 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.122701 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.122900 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.122929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.123005 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xqgf2" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.122929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.127663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wsg2v"] Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.146818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-config-data\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.146890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-fernet-keys\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.146923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-credential-keys\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.148774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-scripts\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.148937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csxng\" (UniqueName: \"kubernetes.io/projected/180a38b2-06f7-49bb-8641-4d82c2e14183-kube-api-access-csxng\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.149066 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-combined-ca-bundle\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.250285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csxng\" (UniqueName: \"kubernetes.io/projected/180a38b2-06f7-49bb-8641-4d82c2e14183-kube-api-access-csxng\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.250596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-combined-ca-bundle\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.250761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-config-data\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.250889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-fernet-keys\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.250970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-credential-keys\") pod 
\"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.251099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-scripts\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.254391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-scripts\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.254554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-credential-keys\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.255114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-config-data\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.255696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-combined-ca-bundle\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: 
I0219 18:51:59.266219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-fernet-keys\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.266259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csxng\" (UniqueName: \"kubernetes.io/projected/180a38b2-06f7-49bb-8641-4d82c2e14183-kube-api-access-csxng\") pod \"keystone-bootstrap-wsg2v\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.452438 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.937282 4749 generic.go:334] "Generic (PLEG): container finished" podID="d504807c-589e-4665-bc1d-865473001928" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" exitCode=0 Feb 19 18:51:59 crc kubenswrapper[4749]: I0219 18:51:59.938386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d504807c-589e-4665-bc1d-865473001928","Type":"ContainerDied","Data":"897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885"} Feb 19 18:52:00 crc kubenswrapper[4749]: I0219 18:52:00.691561 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6e5340-1a1a-421a-b3bb-f0d83811a28e" path="/var/lib/kubelet/pods/ba6e5340-1a1a-421a-b3bb-f0d83811a28e/volumes" Feb 19 18:52:02 crc kubenswrapper[4749]: I0219 18:52:02.961786 4749 generic.go:334] "Generic (PLEG): container finished" podID="763173db-176a-426d-bd85-e051d56ec5cf" containerID="017ee1107b65424619a4a0a0459814477d6930cc4863971e4911ed32a3f6c743" exitCode=1 Feb 19 18:52:02 crc kubenswrapper[4749]: I0219 
18:52:02.961862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerDied","Data":"017ee1107b65424619a4a0a0459814477d6930cc4863971e4911ed32a3f6c743"} Feb 19 18:52:02 crc kubenswrapper[4749]: I0219 18:52:02.962677 4749 scope.go:117] "RemoveContainer" containerID="017ee1107b65424619a4a0a0459814477d6930cc4863971e4911ed32a3f6c743" Feb 19 18:52:03 crc kubenswrapper[4749]: E0219 18:52:03.196141 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:03 crc kubenswrapper[4749]: E0219 18:52:03.197710 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:03 crc kubenswrapper[4749]: E0219 18:52:03.197982 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:03 crc kubenswrapper[4749]: E0219 18:52:03.198136 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" Feb 19 18:52:03 crc kubenswrapper[4749]: I0219 18:52:03.552536 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 18:52:03 crc kubenswrapper[4749]: I0219 18:52:03.552582 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 18:52:03 crc kubenswrapper[4749]: I0219 18:52:03.859885 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Feb 19 18:52:08 crc kubenswrapper[4749]: E0219 18:52:08.195417 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:08 crc kubenswrapper[4749]: E0219 18:52:08.197630 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:08 crc kubenswrapper[4749]: E0219 18:52:08.197945 4749 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:08 crc kubenswrapper[4749]: E0219 18:52:08.198008 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" Feb 19 18:52:08 crc kubenswrapper[4749]: I0219 18:52:08.860598 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Feb 19 18:52:08 crc kubenswrapper[4749]: I0219 18:52:08.860807 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.603240 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.603549 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.603748 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc8hd8h66hd5h5bdh697hbch9dh574h657h698hb9h77h66fh674h5fch75h54dh67h67h6ch654h9h598hf7h85hbdhfch5d6h657h65ch659q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9s8rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6ff5f99c57-gtnp4_openstack(9013866c-5d2f-4e3d-b7fe-d38f7339c6f0): ErrImagePull: 
rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.605842 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-6ff5f99c57-gtnp4" podUID="9013866c-5d2f-4e3d-b7fe-d38f7339c6f0" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.631275 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.631346 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.631510 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbdh696h99hd7h544h5f9h67fhf7h75h584h86hd9h548h56dh4h65dh67ch67fh666h5d4h5f6hbhc7h68fh5b5hd5hc9hbbh568h679h58fh5cdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4d6r7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-59c4845857-58lkr_openstack(bd8ec365-fbcd-4680-90cf-9a41cae499f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 
18:52:11.633934 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-59c4845857-58lkr" podUID="bd8ec365-fbcd-4680-90cf-9a41cae499f3" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.635256 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.635293 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 18:52:11.635397 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n549h5fch654h575hc7h665h5d4h548h57h556h678h67bhch557hc9h87h5dh65fh64h64fh548hcfh64ch59bh68bh644hfhb6h569hcdh59ch577q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jw4rw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8678f4d997-tsdv8_openstack(ac47ec62-11c1-4eea-91c5-0331050fd880): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:52:11 crc kubenswrapper[4749]: E0219 
18:52:11.637407 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-8678f4d997-tsdv8" podUID="ac47ec62-11c1-4eea-91c5-0331050fd880" Feb 19 18:52:12 crc kubenswrapper[4749]: E0219 18:52:12.421516 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 19 18:52:12 crc kubenswrapper[4749]: E0219 18:52:12.421981 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 19 18:52:12 crc kubenswrapper[4749]: E0219 18:52:12.422143 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.75:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvv7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-cmbgb_openstack(5701fcc2-ae2a-4017-8991-3470421ff234): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:52:12 crc kubenswrapper[4749]: E0219 18:52:12.423603 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-cmbgb" 
podUID="5701fcc2-ae2a-4017-8991-3470421ff234" Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.067387 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-cmbgb" podUID="5701fcc2-ae2a-4017-8991-3470421ff234" Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.194233 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.194705 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.195187 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.195298 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created 
or running: checking if PID of 897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.402773 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.402813 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.402921 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.75:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xh6wq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f8dgh_openstack(d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 18:52:13 crc kubenswrapper[4749]: E0219 18:52:13.404221 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-f8dgh" podUID="d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.552859 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.553277 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.603316 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.607100 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.716779 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.739004 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.746637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-config-data\") pod \"d504807c-589e-4665-bc1d-865473001928\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.746677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-scripts\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.746704 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-httpd-run\") pod \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.746721 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.746814 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-combined-ca-bundle\") pod \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.746889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8mlb\" (UniqueName: 
\"kubernetes.io/projected/5c47c74b-13d3-47fa-859a-3b26113630b6-kube-api-access-k8mlb\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747330 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-httpd-run\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj7st\" (UniqueName: \"kubernetes.io/projected/d504807c-589e-4665-bc1d-865473001928-kube-api-access-jj7st\") pod \"d504807c-589e-4665-bc1d-865473001928\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gp2p\" (UniqueName: \"kubernetes.io/projected/feeb65e2-e83b-4028-b7de-fa94205ccd40-kube-api-access-4gp2p\") pod \"feeb65e2-e83b-4028-b7de-fa94205ccd40\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-public-tls-certs\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-dns-svc\") pod \"feeb65e2-e83b-4028-b7de-fa94205ccd40\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " Feb 19 18:52:13 crc kubenswrapper[4749]: 
I0219 18:52:13.747467 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-scripts\") pod \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747522 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-config\") pod \"feeb65e2-e83b-4028-b7de-fa94205ccd40\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d504807c-589e-4665-bc1d-865473001928-logs\") pod \"d504807c-589e-4665-bc1d-865473001928\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747600 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-config-data\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-logs\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc 
kubenswrapper[4749]: I0219 18:52:13.747655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-config-data\") pod \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747688 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-sb\") pod \"feeb65e2-e83b-4028-b7de-fa94205ccd40\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-nb\") pod \"feeb65e2-e83b-4028-b7de-fa94205ccd40\" (UID: \"feeb65e2-e83b-4028-b7de-fa94205ccd40\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747776 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-internal-tls-certs\") pod \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-combined-ca-bundle\") pod \"d504807c-589e-4665-bc1d-865473001928\" (UID: \"d504807c-589e-4665-bc1d-865473001928\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-logs\") pod 
\"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zprll\" (UniqueName: \"kubernetes.io/projected/eff44afa-e6ee-4d6f-a041-3734a9e4a782-kube-api-access-zprll\") pod \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\" (UID: \"eff44afa-e6ee-4d6f-a041-3734a9e4a782\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.747864 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-combined-ca-bundle\") pod \"5c47c74b-13d3-47fa-859a-3b26113630b6\" (UID: \"5c47c74b-13d3-47fa-859a-3b26113630b6\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.752459 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.758701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.759772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d504807c-589e-4665-bc1d-865473001928-kube-api-access-jj7st" (OuterVolumeSpecName: "kube-api-access-jj7st") pod "d504807c-589e-4665-bc1d-865473001928" (UID: "d504807c-589e-4665-bc1d-865473001928"). InnerVolumeSpecName "kube-api-access-jj7st". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.760201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d504807c-589e-4665-bc1d-865473001928-logs" (OuterVolumeSpecName: "logs") pod "d504807c-589e-4665-bc1d-865473001928" (UID: "d504807c-589e-4665-bc1d-865473001928"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.760385 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-logs" (OuterVolumeSpecName: "logs") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.760678 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-logs" (OuterVolumeSpecName: "logs") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.760911 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.776499 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff44afa-e6ee-4d6f-a041-3734a9e4a782-kube-api-access-zprll" (OuterVolumeSpecName: "kube-api-access-zprll") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "kube-api-access-zprll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.776648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feeb65e2-e83b-4028-b7de-fa94205ccd40-kube-api-access-4gp2p" (OuterVolumeSpecName: "kube-api-access-4gp2p") pod "feeb65e2-e83b-4028-b7de-fa94205ccd40" (UID: "feeb65e2-e83b-4028-b7de-fa94205ccd40"). InnerVolumeSpecName "kube-api-access-4gp2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.779176 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.784355 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.791331 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-scripts" (OuterVolumeSpecName: "scripts") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.792184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c47c74b-13d3-47fa-859a-3b26113630b6-kube-api-access-k8mlb" (OuterVolumeSpecName: "kube-api-access-k8mlb") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "kube-api-access-k8mlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.792208 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.803253 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.817312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-scripts" (OuterVolumeSpecName: "scripts") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-logs\") pod \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8ec365-fbcd-4680-90cf-9a41cae499f3-horizon-secret-key\") pod \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848515 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac47ec62-11c1-4eea-91c5-0331050fd880-horizon-secret-key\") pod \"ac47ec62-11c1-4eea-91c5-0331050fd880\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848622 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d6r7\" (UniqueName: \"kubernetes.io/projected/bd8ec365-fbcd-4680-90cf-9a41cae499f3-kube-api-access-4d6r7\") pod \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-scripts\") pod \"ac47ec62-11c1-4eea-91c5-0331050fd880\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ac47ec62-11c1-4eea-91c5-0331050fd880-logs\") pod \"ac47ec62-11c1-4eea-91c5-0331050fd880\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848728 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-logs" (OuterVolumeSpecName: "logs") pod "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0" (UID: "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-scripts\") pod \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-config-data\") pod \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-scripts\") pod \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.848865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ec365-fbcd-4680-90cf-9a41cae499f3-logs\") pod \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\" (UID: \"bd8ec365-fbcd-4680-90cf-9a41cae499f3\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 
18:52:13.848944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-config-data\") pod \"ac47ec62-11c1-4eea-91c5-0331050fd880\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-horizon-secret-key\") pod \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-config-data\") pod \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw4rw\" (UniqueName: \"kubernetes.io/projected/ac47ec62-11c1-4eea-91c5-0331050fd880-kube-api-access-jw4rw\") pod \"ac47ec62-11c1-4eea-91c5-0331050fd880\" (UID: \"ac47ec62-11c1-4eea-91c5-0331050fd880\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849372 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s8rj\" (UniqueName: \"kubernetes.io/projected/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-kube-api-access-9s8rj\") pod \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\" (UID: \"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0\") " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849614 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-scripts" (OuterVolumeSpecName: "scripts") pod 
"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0" (UID: "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-config-data" (OuterVolumeSpecName: "config-data") pod "bd8ec365-fbcd-4680-90cf-9a41cae499f3" (UID: "bd8ec365-fbcd-4680-90cf-9a41cae499f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849801 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849840 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj7st\" (UniqueName: \"kubernetes.io/projected/d504807c-589e-4665-bc1d-865473001928-kube-api-access-jj7st\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849856 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gp2p\" (UniqueName: \"kubernetes.io/projected/feeb65e2-e83b-4028-b7de-fa94205ccd40-kube-api-access-4gp2p\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849870 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849894 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849906 4749 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d504807c-589e-4665-bc1d-865473001928-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849917 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c47c74b-13d3-47fa-859a-3b26113630b6-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849927 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849937 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zprll\" (UniqueName: \"kubernetes.io/projected/eff44afa-e6ee-4d6f-a041-3734a9e4a782-kube-api-access-zprll\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849959 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849970 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff44afa-e6ee-4d6f-a041-3734a9e4a782-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.849989 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 19 18:52:13 crc 
kubenswrapper[4749]: I0219 18:52:13.850019 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8mlb\" (UniqueName: \"kubernetes.io/projected/5c47c74b-13d3-47fa-859a-3b26113630b6-kube-api-access-k8mlb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.850053 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.850520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-scripts" (OuterVolumeSpecName: "scripts") pod "ac47ec62-11c1-4eea-91c5-0331050fd880" (UID: "ac47ec62-11c1-4eea-91c5-0331050fd880"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.850695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-scripts" (OuterVolumeSpecName: "scripts") pod "bd8ec365-fbcd-4680-90cf-9a41cae499f3" (UID: "bd8ec365-fbcd-4680-90cf-9a41cae499f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.851353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-config-data" (OuterVolumeSpecName: "config-data") pod "ac47ec62-11c1-4eea-91c5-0331050fd880" (UID: "ac47ec62-11c1-4eea-91c5-0331050fd880"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.851409 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac47ec62-11c1-4eea-91c5-0331050fd880-logs" (OuterVolumeSpecName: "logs") pod "ac47ec62-11c1-4eea-91c5-0331050fd880" (UID: "ac47ec62-11c1-4eea-91c5-0331050fd880"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.851817 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd8ec365-fbcd-4680-90cf-9a41cae499f3-logs" (OuterVolumeSpecName: "logs") pod "bd8ec365-fbcd-4680-90cf-9a41cae499f3" (UID: "bd8ec365-fbcd-4680-90cf-9a41cae499f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.851874 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-config-data" (OuterVolumeSpecName: "config-data") pod "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0" (UID: "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.862368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-kube-api-access-9s8rj" (OuterVolumeSpecName: "kube-api-access-9s8rj") pod "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0" (UID: "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0"). InnerVolumeSpecName "kube-api-access-9s8rj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.864326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8ec365-fbcd-4680-90cf-9a41cae499f3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bd8ec365-fbcd-4680-90cf-9a41cae499f3" (UID: "bd8ec365-fbcd-4680-90cf-9a41cae499f3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.865210 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0" (UID: "9013866c-5d2f-4e3d-b7fe-d38f7339c6f0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.866381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac47ec62-11c1-4eea-91c5-0331050fd880-kube-api-access-jw4rw" (OuterVolumeSpecName: "kube-api-access-jw4rw") pod "ac47ec62-11c1-4eea-91c5-0331050fd880" (UID: "ac47ec62-11c1-4eea-91c5-0331050fd880"). InnerVolumeSpecName "kube-api-access-jw4rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.866650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8ec365-fbcd-4680-90cf-9a41cae499f3-kube-api-access-4d6r7" (OuterVolumeSpecName: "kube-api-access-4d6r7") pod "bd8ec365-fbcd-4680-90cf-9a41cae499f3" (UID: "bd8ec365-fbcd-4680-90cf-9a41cae499f3"). InnerVolumeSpecName "kube-api-access-4d6r7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.871976 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac47ec62-11c1-4eea-91c5-0331050fd880-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ac47ec62-11c1-4eea-91c5-0331050fd880" (UID: "ac47ec62-11c1-4eea-91c5-0331050fd880"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.902352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.905262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d504807c-589e-4665-bc1d-865473001928" (UID: "d504807c-589e-4665-bc1d-865473001928"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:13 crc kubenswrapper[4749]: W0219 18:52:13.906457 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod180a38b2_06f7_49bb_8641_4d82c2e14183.slice/crio-98473bfe28f6412d6adc4f301b7e3d51eae06583ed4a1fa49c3c7531e1f240c4 WatchSource:0}: Error finding container 98473bfe28f6412d6adc4f301b7e3d51eae06583ed4a1fa49c3c7531e1f240c4: Status 404 returned error can't find the container with id 98473bfe28f6412d6adc4f301b7e3d51eae06583ed4a1fa49c3c7531e1f240c4 Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.911092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wsg2v"] Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950826 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d6r7\" (UniqueName: \"kubernetes.io/projected/bd8ec365-fbcd-4680-90cf-9a41cae499f3-kube-api-access-4d6r7\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950868 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950880 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac47ec62-11c1-4eea-91c5-0331050fd880-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950893 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950906 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/bd8ec365-fbcd-4680-90cf-9a41cae499f3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950920 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ec365-fbcd-4680-90cf-9a41cae499f3-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950933 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac47ec62-11c1-4eea-91c5-0331050fd880-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950944 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950957 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950968 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950979 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw4rw\" (UniqueName: \"kubernetes.io/projected/ac47ec62-11c1-4eea-91c5-0331050fd880-kube-api-access-jw4rw\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.950991 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s8rj\" (UniqueName: \"kubernetes.io/projected/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0-kube-api-access-9s8rj\") on node \"crc\" DevicePath \"\"" Feb 19 
18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.951004 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.951015 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd8ec365-fbcd-4680-90cf-9a41cae499f3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.951046 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac47ec62-11c1-4eea-91c5-0331050fd880-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.971284 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b4d589db8-c89ft"] Feb 19 18:52:13 crc kubenswrapper[4749]: I0219 18:52:13.977056 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.002355 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.005785 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-866f5f4f4b-zfvsn"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.021857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.030035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-config-data" (OuterVolumeSpecName: "config-data") pod "d504807c-589e-4665-bc1d-865473001928" (UID: "d504807c-589e-4665-bc1d-865473001928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.033975 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "feeb65e2-e83b-4028-b7de-fa94205ccd40" (UID: "feeb65e2-e83b-4028-b7de-fa94205ccd40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.037490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "feeb65e2-e83b-4028-b7de-fa94205ccd40" (UID: "feeb65e2-e83b-4028-b7de-fa94205ccd40"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.047604 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.053970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-config-data" (OuterVolumeSpecName: "config-data") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060207 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060264 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060286 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060299 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060335 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060350 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060366 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d504807c-589e-4665-bc1d-865473001928-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.060377 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.090327 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-config-data" (OuterVolumeSpecName: "config-data") pod "5c47c74b-13d3-47fa-859a-3b26113630b6" (UID: "5c47c74b-13d3-47fa-859a-3b26113630b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.092590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4d589db8-c89ft" event={"ID":"8812ac95-8284-4b4f-a838-b5ab30a55fad","Type":"ContainerStarted","Data":"073a6dd4972bbbea5e9638ff1d1143c0219adc3d5bff175cdaa2cd0c39de058f"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.096510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8678f4d997-tsdv8" event={"ID":"ac47ec62-11c1-4eea-91c5-0331050fd880","Type":"ContainerDied","Data":"f0c3f447954ade864d88c78fc089d9ee8d1a26916b4c207b0391d7208e0e2acb"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.096624 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8678f4d997-tsdv8" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.097358 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eff44afa-e6ee-4d6f-a041-3734a9e4a782" (UID: "eff44afa-e6ee-4d6f-a041-3734a9e4a782"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.099212 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59c4845857-58lkr" event={"ID":"bd8ec365-fbcd-4680-90cf-9a41cae499f3","Type":"ContainerDied","Data":"c7608286a4a0987c5d4894464a3d7561cc2beddd22e03f44777556c4553e6b08"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.099339 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59c4845857-58lkr" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.113252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qr8r7" event={"ID":"4e6e94fd-e114-41b2-8634-ca805b5e260f","Type":"ContainerStarted","Data":"5efeef4609cfd895356d5ac735bb91074bc2f5dc82f3c2ece8267ae058e92e5e"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.116371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-config" (OuterVolumeSpecName: "config") pod "feeb65e2-e83b-4028-b7de-fa94205ccd40" (UID: "feeb65e2-e83b-4028-b7de-fa94205ccd40"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.117715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d504807c-589e-4665-bc1d-865473001928","Type":"ContainerDied","Data":"21307c92df3606595000710a5a2a6268c2e9b4cc817a0b2db1415f08e1e1ac45"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.117762 4749 scope.go:117] "RemoveContainer" containerID="897adf459149d118674caea831d4fb88b031b1ee84a890ff019055b73d760885" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.117975 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.118578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "feeb65e2-e83b-4028-b7de-fa94205ccd40" (UID: "feeb65e2-e83b-4028-b7de-fa94205ccd40"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.128853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerStarted","Data":"6ce9e7be68ee2c73ea58411c5582a5214befdbaa9b34e3fc190b6c493b5cc976"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.132093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c47c74b-13d3-47fa-859a-3b26113630b6","Type":"ContainerDied","Data":"7655c226353c5f7b5f909c8143c3d12f2d6809dc196016229c302de7ed6d9954"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.132207 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.137822 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-qr8r7" podStartSLOduration=4.024289657 podStartE2EDuration="31.137805474s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="2026-02-19 18:51:46.301891012 +0000 UTC m=+1080.263110966" lastFinishedPulling="2026-02-19 18:52:13.415406829 +0000 UTC m=+1107.376626783" observedRunningTime="2026-02-19 18:52:14.129302327 +0000 UTC m=+1108.090522291" watchObservedRunningTime="2026-02-19 18:52:14.137805474 +0000 UTC m=+1108.099025428" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.143447 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.146305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff44afa-e6ee-4d6f-a041-3734a9e4a782","Type":"ContainerDied","Data":"546e63d61a6b047c246182dd4dd3b29ec17a06bbb31f617717cbc677963581c7"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.153235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" event={"ID":"feeb65e2-e83b-4028-b7de-fa94205ccd40","Type":"ContainerDied","Data":"76ff485dcce11c75362c064274cdb60ebe4ed6ff9773cdf30c4c7334c86f0081"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.153623 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-746f4bbcc9-sjckp" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.159586 4749 scope.go:117] "RemoveContainer" containerID="897fdd4e61c3eca9165f228a49ca773afc4b23a7886f0ec1b3e4c8c8365063dd" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.168611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerStarted","Data":"cecf2d389ccc361882dad3f9a6bf6528e23aa044d6a9d743e749d18ef74520d2"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.170256 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eff44afa-e6ee-4d6f-a041-3734a9e4a782-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.170285 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.170296 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c47c74b-13d3-47fa-859a-3b26113630b6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.170305 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/feeb65e2-e83b-4028-b7de-fa94205ccd40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.182947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866f5f4f4b-zfvsn" event={"ID":"f8dcae23-3df3-4de3-8a0a-499c15a90daa","Type":"ContainerStarted","Data":"b138cbe178c3181869550453becd39b69d38d5ee3e960302cedeabe7f2c27acc"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.186824 4749 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8678f4d997-tsdv8"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.188856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wsg2v" event={"ID":"180a38b2-06f7-49bb-8641-4d82c2e14183","Type":"ContainerStarted","Data":"98473bfe28f6412d6adc4f301b7e3d51eae06583ed4a1fa49c3c7531e1f240c4"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.203676 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8678f4d997-tsdv8"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.205846 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6ff5f99c57-gtnp4" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.206340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6ff5f99c57-gtnp4" event={"ID":"9013866c-5d2f-4e3d-b7fe-d38f7339c6f0","Type":"ContainerDied","Data":"3cfeec090f7f50e440cb1cdb491384d96fd5188a9d4c6124f44f0356a60f0625"} Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.210993 4749 scope.go:117] "RemoveContainer" containerID="a0410ab97e285a06c2966cd70f61787bee43eba44b005f0657a88e4cfcf7019b" Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.211249 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-f8dgh" podUID="d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.284317 4749 scope.go:117] "RemoveContainer" containerID="1c7b85a94c5bf116444861434a4880d49fe54402ee0914e03635675f8667c31e" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.305095 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59c4845857-58lkr"] Feb 19 18:52:14 crc 
kubenswrapper[4749]: I0219 18:52:14.319535 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59c4845857-58lkr"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.349517 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.370258 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.392724 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.408069 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418051 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.418410 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-httpd" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418429 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-httpd" Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.418449 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418456 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.418471 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-httpd" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418477 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-httpd" Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.418493 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-log" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418499 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-log" Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.418510 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="init" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="init" Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.418527 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="dnsmasq-dns" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418535 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="dnsmasq-dns" Feb 19 18:52:14 crc kubenswrapper[4749]: E0219 18:52:14.418548 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-log" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418553 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-log" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418710 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-log" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418743 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d504807c-589e-4665-bc1d-865473001928" containerName="watcher-applier" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418754 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-httpd" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418766 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" containerName="dnsmasq-dns" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418779 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" containerName="glance-log" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.418787 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" containerName="glance-httpd" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.419341 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.422041 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.428526 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.443766 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.455851 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.457640 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.460843 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mcps2" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.463055 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.464006 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.464530 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.467136 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.492483 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.494016 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.496624 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.496940 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.500428 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.527083 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.533790 4749 scope.go:117] "RemoveContainer" containerID="a0a5fca73f5fe6c3ebf775e367b944412a62d951e18a6d1a1fda0ef64a52a1bb" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.557963 4749 scope.go:117] "RemoveContainer" containerID="8ba3d5b99d0381c433531c1399e7c1130409eac94c1973be2efe946a9286bcb0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.559536 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6ff5f99c57-gtnp4"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.568328 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6ff5f99c57-gtnp4"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582300 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582350 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk52t\" (UniqueName: \"kubernetes.io/projected/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-kube-api-access-wk52t\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwt25\" (UniqueName: \"kubernetes.io/projected/2124f654-902e-4591-9c6f-e98e919dc8ca-kube-api-access-hwt25\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2124f654-902e-4591-9c6f-e98e919dc8ca-logs\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582460 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582475 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-logs\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.582587 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-config-data\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.596484 4749 scope.go:117] "RemoveContainer" containerID="cd262b4a040b37c0de314cd9b9ee003eee8028544d645592dbdc70c9d5d62512" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.635401 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746f4bbcc9-sjckp"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.650357 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-746f4bbcc9-sjckp"] Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-config-data\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685281 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685318 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685343 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685408 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk52t\" (UniqueName: \"kubernetes.io/projected/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-kube-api-access-wk52t\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685431 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-logs\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwt25\" 
(UniqueName: \"kubernetes.io/projected/2124f654-902e-4591-9c6f-e98e919dc8ca-kube-api-access-hwt25\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2124f654-902e-4591-9c6f-e98e919dc8ca-logs\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685562 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxdtj\" (UniqueName: \"kubernetes.io/projected/7e26f3eb-2933-4e84-9be8-2529ff29fc93-kube-api-access-hxdtj\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685592 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.685837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-logs\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.686132 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.686391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-logs\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.686676 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2124f654-902e-4591-9c6f-e98e919dc8ca-logs\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.688156 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.693692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-config-data\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.710294 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.711084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.711491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.712015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 
18:52:14.740601 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwt25\" (UniqueName: \"kubernetes.io/projected/2124f654-902e-4591-9c6f-e98e919dc8ca-kube-api-access-hwt25\") pod \"watcher-applier-0\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.745323 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c47c74b-13d3-47fa-859a-3b26113630b6" path="/var/lib/kubelet/pods/5c47c74b-13d3-47fa-859a-3b26113630b6/volumes" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.746643 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9013866c-5d2f-4e3d-b7fe-d38f7339c6f0" path="/var/lib/kubelet/pods/9013866c-5d2f-4e3d-b7fe-d38f7339c6f0/volumes" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.747865 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac47ec62-11c1-4eea-91c5-0331050fd880" path="/var/lib/kubelet/pods/ac47ec62-11c1-4eea-91c5-0331050fd880/volumes" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.749471 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8ec365-fbcd-4680-90cf-9a41cae499f3" path="/var/lib/kubelet/pods/bd8ec365-fbcd-4680-90cf-9a41cae499f3/volumes" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.754670 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d504807c-589e-4665-bc1d-865473001928" path="/var/lib/kubelet/pods/d504807c-589e-4665-bc1d-865473001928/volumes" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.755736 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk52t\" (UniqueName: \"kubernetes.io/projected/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-kube-api-access-wk52t\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc 
kubenswrapper[4749]: I0219 18:52:14.761518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.779969 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff44afa-e6ee-4d6f-a041-3734a9e4a782" path="/var/lib/kubelet/pods/eff44afa-e6ee-4d6f-a041-3734a9e4a782/volumes" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.780998 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feeb65e2-e83b-4028-b7de-fa94205ccd40" path="/var/lib/kubelet/pods/feeb65e2-e83b-4028-b7de-fa94205ccd40/volumes" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.787773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.787820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxdtj\" (UniqueName: \"kubernetes.io/projected/7e26f3eb-2933-4e84-9be8-2529ff29fc93-kube-api-access-hxdtj\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.787847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " 
pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.787932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.787992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.788010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.788084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-logs\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.788102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.788397 4749 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.803165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.803247 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.803593 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-logs\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.804009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.804689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.807150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxdtj\" (UniqueName: \"kubernetes.io/projected/7e26f3eb-2933-4e84-9be8-2529ff29fc93-kube-api-access-hxdtj\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.807387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.809934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.817061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " pod="openstack/glance-default-external-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.883488 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.930510 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:14 crc kubenswrapper[4749]: I0219 18:52:14.981461 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.278592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wsg2v" event={"ID":"180a38b2-06f7-49bb-8641-4d82c2e14183","Type":"ContainerStarted","Data":"aeb577c04e53e47cd1c67a35b32c7ae083c5ca53af319660498c257ead2a633e"} Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.289899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866f5f4f4b-zfvsn" event={"ID":"f8dcae23-3df3-4de3-8a0a-499c15a90daa","Type":"ContainerStarted","Data":"2b8e7df8773bc6af522d0c6c7d572642c3a5fdcfab6d7b7b20a03d6c230947c7"} Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.289959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866f5f4f4b-zfvsn" event={"ID":"f8dcae23-3df3-4de3-8a0a-499c15a90daa","Type":"ContainerStarted","Data":"7ed458c93c10e8a5d70696643628559a1656a862171ba6383208264fc426b8b5"} Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.306311 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wsg2v" podStartSLOduration=16.306291411 podStartE2EDuration="16.306291411s" podCreationTimestamp="2026-02-19 18:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:15.30018049 +0000 UTC m=+1109.261400454" watchObservedRunningTime="2026-02-19 18:52:15.306291411 +0000 UTC m=+1109.267511365" Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.314078 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4d589db8-c89ft" event={"ID":"8812ac95-8284-4b4f-a838-b5ab30a55fad","Type":"ContainerStarted","Data":"4f55ac9e20db43f08ba8674da0462556b179c7af240b517cc6d1a9cff2174890"} Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.314126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4d589db8-c89ft" event={"ID":"8812ac95-8284-4b4f-a838-b5ab30a55fad","Type":"ContainerStarted","Data":"f0a875e9630d7f27af99a2b5718a090355c41085d6fd1f7bc26fdbb22cbde38f"} Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.335673 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-866f5f4f4b-zfvsn" podStartSLOduration=24.201416064 podStartE2EDuration="24.335654101s" podCreationTimestamp="2026-02-19 18:51:51 +0000 UTC" firstStartedPulling="2026-02-19 18:52:14.025304574 +0000 UTC m=+1107.986524528" lastFinishedPulling="2026-02-19 18:52:14.159542611 +0000 UTC m=+1108.120762565" observedRunningTime="2026-02-19 18:52:15.319925711 +0000 UTC m=+1109.281145675" watchObservedRunningTime="2026-02-19 18:52:15.335654101 +0000 UTC m=+1109.296874055" Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.358344 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b4d589db8-c89ft" podStartSLOduration=23.276404506 podStartE2EDuration="23.358322423s" podCreationTimestamp="2026-02-19 18:51:52 +0000 UTC" firstStartedPulling="2026-02-19 18:52:14.008295789 +0000 UTC m=+1107.969515753" lastFinishedPulling="2026-02-19 18:52:14.090213716 +0000 UTC m=+1108.051433670" observedRunningTime="2026-02-19 18:52:15.343614782 +0000 UTC m=+1109.304834736" watchObservedRunningTime="2026-02-19 18:52:15.358322423 +0000 UTC m=+1109.319542377" Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.516560 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:52:15 crc kubenswrapper[4749]: 
I0219 18:52:15.694480 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:52:15 crc kubenswrapper[4749]: I0219 18:52:15.843577 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:52:16 crc kubenswrapper[4749]: I0219 18:52:16.328345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e26f3eb-2933-4e84-9be8-2529ff29fc93","Type":"ContainerStarted","Data":"1343d0cfac32a0bf41ee917814cf8be72fb255949b8057ad931ada62110fe2b6"} Feb 19 18:52:16 crc kubenswrapper[4749]: I0219 18:52:16.333990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2124f654-902e-4591-9c6f-e98e919dc8ca","Type":"ContainerStarted","Data":"71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2"} Feb 19 18:52:16 crc kubenswrapper[4749]: I0219 18:52:16.334107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2124f654-902e-4591-9c6f-e98e919dc8ca","Type":"ContainerStarted","Data":"510a5190113cb3b81aad6b5e0d50068f3a5e2da5ccc962f4dc327094fb96ef02"} Feb 19 18:52:16 crc kubenswrapper[4749]: I0219 18:52:16.336107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerStarted","Data":"d9d38f858e6edc203dc8df1bf5835ed813e32cef59fc673493c9bd967a7663f5"} Feb 19 18:52:16 crc kubenswrapper[4749]: I0219 18:52:16.337982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e","Type":"ContainerStarted","Data":"10a7a972a3ca4987065ef5cc5e43b26ec415bd4394cb473967e9e3102dbe2671"} Feb 19 18:52:16 crc kubenswrapper[4749]: I0219 18:52:16.359912 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" 
podStartSLOduration=2.359897261 podStartE2EDuration="2.359897261s" podCreationTimestamp="2026-02-19 18:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:16.355252031 +0000 UTC m=+1110.316471985" watchObservedRunningTime="2026-02-19 18:52:16.359897261 +0000 UTC m=+1110.321117215" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.125122 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.294902 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-config-data\") pod \"facae815-e01a-4641-b07c-3530303cc691\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.294987 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql7zx\" (UniqueName: \"kubernetes.io/projected/facae815-e01a-4641-b07c-3530303cc691-kube-api-access-ql7zx\") pod \"facae815-e01a-4641-b07c-3530303cc691\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.295078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/facae815-e01a-4641-b07c-3530303cc691-logs\") pod \"facae815-e01a-4641-b07c-3530303cc691\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.295099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-custom-prometheus-ca\") pod \"facae815-e01a-4641-b07c-3530303cc691\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " Feb 19 18:52:17 crc 
kubenswrapper[4749]: I0219 18:52:17.295123 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-combined-ca-bundle\") pod \"facae815-e01a-4641-b07c-3530303cc691\" (UID: \"facae815-e01a-4641-b07c-3530303cc691\") " Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.296131 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/facae815-e01a-4641-b07c-3530303cc691-logs" (OuterVolumeSpecName: "logs") pod "facae815-e01a-4641-b07c-3530303cc691" (UID: "facae815-e01a-4641-b07c-3530303cc691"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.302390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/facae815-e01a-4641-b07c-3530303cc691-kube-api-access-ql7zx" (OuterVolumeSpecName: "kube-api-access-ql7zx") pod "facae815-e01a-4641-b07c-3530303cc691" (UID: "facae815-e01a-4641-b07c-3530303cc691"). InnerVolumeSpecName "kube-api-access-ql7zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.349270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "facae815-e01a-4641-b07c-3530303cc691" (UID: "facae815-e01a-4641-b07c-3530303cc691"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.360763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e","Type":"ContainerStarted","Data":"430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f"} Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.371456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e26f3eb-2933-4e84-9be8-2529ff29fc93","Type":"ContainerStarted","Data":"2faff4371d10d3fe6bb565ff39f306b9c697e0b74686119bf87e91ca4a42c13d"} Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.393051 4749 generic.go:334] "Generic (PLEG): container finished" podID="facae815-e01a-4641-b07c-3530303cc691" containerID="d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db" exitCode=137 Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.393181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"facae815-e01a-4641-b07c-3530303cc691","Type":"ContainerDied","Data":"d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db"} Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.393345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"facae815-e01a-4641-b07c-3530303cc691","Type":"ContainerDied","Data":"e318394177cf0f3c51292edf370d483582b16dd533b4e12b59a4c82b5ac4c570"} Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.393365 4749 scope.go:117] "RemoveContainer" containerID="d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.393704 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.398186 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql7zx\" (UniqueName: \"kubernetes.io/projected/facae815-e01a-4641-b07c-3530303cc691-kube-api-access-ql7zx\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.398214 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/facae815-e01a-4641-b07c-3530303cc691-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.398225 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.415196 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "facae815-e01a-4641-b07c-3530303cc691" (UID: "facae815-e01a-4641-b07c-3530303cc691"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.432069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-config-data" (OuterVolumeSpecName: "config-data") pod "facae815-e01a-4641-b07c-3530303cc691" (UID: "facae815-e01a-4641-b07c-3530303cc691"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.500608 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.500696 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/facae815-e01a-4641-b07c-3530303cc691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.528431 4749 scope.go:117] "RemoveContainer" containerID="ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.602498 4749 scope.go:117] "RemoveContainer" containerID="d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db" Feb 19 18:52:17 crc kubenswrapper[4749]: E0219 18:52:17.606339 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db\": container with ID starting with d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db not found: ID does not exist" containerID="d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.606422 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db"} err="failed to get container status \"d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db\": rpc error: code = NotFound desc = could not find container \"d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db\": container with ID starting with d80ef33824a77d689c9f2c9c292b16fa7bd25fe28567d403b35fb92fc7ed97db not found: ID does not 
exist" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.606456 4749 scope.go:117] "RemoveContainer" containerID="ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7" Feb 19 18:52:17 crc kubenswrapper[4749]: E0219 18:52:17.607282 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7\": container with ID starting with ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7 not found: ID does not exist" containerID="ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.607334 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7"} err="failed to get container status \"ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7\": rpc error: code = NotFound desc = could not find container \"ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7\": container with ID starting with ea65b1b543f81f44fdc7df55387816b69495f41516268b76af60091185e84cb7 not found: ID does not exist" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.731306 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.750214 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.761080 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:17 crc kubenswrapper[4749]: E0219 18:52:17.761481 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api-log" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.761492 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api-log" Feb 19 18:52:17 crc kubenswrapper[4749]: E0219 18:52:17.761510 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.761516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.761698 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.761716 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="facae815-e01a-4641-b07c-3530303cc691" containerName="watcher-api-log" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.762654 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.764959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.767661 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.806926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-config-data\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.807046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebdea16c-c5e5-437d-a23c-8d710b197088-logs\") pod 
\"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.807140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tjdd\" (UniqueName: \"kubernetes.io/projected/ebdea16c-c5e5-437d-a23c-8d710b197088-kube-api-access-7tjdd\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.807188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.807336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.917197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-config-data\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.917653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebdea16c-c5e5-437d-a23c-8d710b197088-logs\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 
18:52:17.917680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tjdd\" (UniqueName: \"kubernetes.io/projected/ebdea16c-c5e5-437d-a23c-8d710b197088-kube-api-access-7tjdd\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.917699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.917802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.918120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebdea16c-c5e5-437d-a23c-8d710b197088-logs\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.931348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.931457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-config-data\") pod 
\"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.943882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:17 crc kubenswrapper[4749]: I0219 18:52:17.944132 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tjdd\" (UniqueName: \"kubernetes.io/projected/ebdea16c-c5e5-437d-a23c-8d710b197088-kube-api-access-7tjdd\") pod \"watcher-api-0\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " pod="openstack/watcher-api-0" Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.079804 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.418705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e","Type":"ContainerStarted","Data":"168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341"} Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.426055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e26f3eb-2933-4e84-9be8-2529ff29fc93","Type":"ContainerStarted","Data":"c7bfe50a5bece5a853c839e17c62830ca96a1e0b6c78f9ccb3cd22ae7b470397"} Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.428782 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e6e94fd-e114-41b2-8634-ca805b5e260f" containerID="5efeef4609cfd895356d5ac735bb91074bc2f5dc82f3c2ece8267ae058e92e5e" exitCode=0 Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.428830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-qr8r7" event={"ID":"4e6e94fd-e114-41b2-8634-ca805b5e260f","Type":"ContainerDied","Data":"5efeef4609cfd895356d5ac735bb91074bc2f5dc82f3c2ece8267ae058e92e5e"} Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.435102 4749 generic.go:334] "Generic (PLEG): container finished" podID="763173db-176a-426d-bd85-e051d56ec5cf" containerID="cecf2d389ccc361882dad3f9a6bf6528e23aa044d6a9d743e749d18ef74520d2" exitCode=1 Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.435139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerDied","Data":"cecf2d389ccc361882dad3f9a6bf6528e23aa044d6a9d743e749d18ef74520d2"} Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.435187 4749 scope.go:117] "RemoveContainer" containerID="017ee1107b65424619a4a0a0459814477d6930cc4863971e4911ed32a3f6c743" Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.436332 4749 scope.go:117] "RemoveContainer" containerID="cecf2d389ccc361882dad3f9a6bf6528e23aa044d6a9d743e749d18ef74520d2" Feb 19 18:52:18 crc kubenswrapper[4749]: E0219 18:52:18.436788 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(763173db-176a-426d-bd85-e051d56ec5cf)\"" pod="openstack/watcher-decision-engine-0" podUID="763173db-176a-426d-bd85-e051d56ec5cf" Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.449465 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.449445268 podStartE2EDuration="4.449445268s" podCreationTimestamp="2026-02-19 18:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
18:52:18.44163401 +0000 UTC m=+1112.402853964" watchObservedRunningTime="2026-02-19 18:52:18.449445268 +0000 UTC m=+1112.410665222" Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.515319 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.515295657 podStartE2EDuration="4.515295657s" podCreationTimestamp="2026-02-19 18:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:18.488276772 +0000 UTC m=+1112.449496726" watchObservedRunningTime="2026-02-19 18:52:18.515295657 +0000 UTC m=+1112.476515611" Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.533149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:18 crc kubenswrapper[4749]: W0219 18:52:18.572386 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebdea16c_c5e5_437d_a23c_8d710b197088.slice/crio-57b34eb5808f62325e2d3e419ab2ce6a673bdb69fe2f15150519f058a015a7ee WatchSource:0}: Error finding container 57b34eb5808f62325e2d3e419ab2ce6a673bdb69fe2f15150519f058a015a7ee: Status 404 returned error can't find the container with id 57b34eb5808f62325e2d3e419ab2ce6a673bdb69fe2f15150519f058a015a7ee Feb 19 18:52:18 crc kubenswrapper[4749]: I0219 18:52:18.695839 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="facae815-e01a-4641-b07c-3530303cc691" path="/var/lib/kubelet/pods/facae815-e01a-4641-b07c-3530303cc691/volumes" Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.445865 4749 generic.go:334] "Generic (PLEG): container finished" podID="180a38b2-06f7-49bb-8641-4d82c2e14183" containerID="aeb577c04e53e47cd1c67a35b32c7ae083c5ca53af319660498c257ead2a633e" exitCode=0 Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.445932 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-wsg2v" event={"ID":"180a38b2-06f7-49bb-8641-4d82c2e14183","Type":"ContainerDied","Data":"aeb577c04e53e47cd1c67a35b32c7ae083c5ca53af319660498c257ead2a633e"} Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.448301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ebdea16c-c5e5-437d-a23c-8d710b197088","Type":"ContainerStarted","Data":"a0c6a0e375f952d13efff3835e318856f4b39cf1139c89d3be131f5c1346fcf2"} Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.448550 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.448564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ebdea16c-c5e5-437d-a23c-8d710b197088","Type":"ContainerStarted","Data":"e259488edfb845b5bdfdc9dc7cbd72198f2564b425afee6ca4d61e66f4cffa87"} Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.448577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ebdea16c-c5e5-437d-a23c-8d710b197088","Type":"ContainerStarted","Data":"57b34eb5808f62325e2d3e419ab2ce6a673bdb69fe2f15150519f058a015a7ee"} Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.507627 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.5076063250000002 podStartE2EDuration="2.507606325s" podCreationTimestamp="2026-02-19 18:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:19.483460351 +0000 UTC m=+1113.444680325" watchObservedRunningTime="2026-02-19 18:52:19.507606325 +0000 UTC m=+1113.468826279" Feb 19 18:52:19 crc kubenswrapper[4749]: I0219 18:52:19.884502 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/watcher-applier-0" Feb 19 18:52:21 crc kubenswrapper[4749]: I0219 18:52:21.899710 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 18:52:22 crc kubenswrapper[4749]: I0219 18:52:22.346147 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:52:22 crc kubenswrapper[4749]: I0219 18:52:22.346226 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:52:22 crc kubenswrapper[4749]: I0219 18:52:22.452109 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:52:22 crc kubenswrapper[4749]: I0219 18:52:22.452212 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.080339 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.229096 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qr8r7" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.235805 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.323148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-config-data\") pod \"4e6e94fd-e114-41b2-8634-ca805b5e260f\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.323466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7h2b\" (UniqueName: \"kubernetes.io/projected/4e6e94fd-e114-41b2-8634-ca805b5e260f-kube-api-access-d7h2b\") pod \"4e6e94fd-e114-41b2-8634-ca805b5e260f\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.323542 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-combined-ca-bundle\") pod \"4e6e94fd-e114-41b2-8634-ca805b5e260f\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.323569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6e94fd-e114-41b2-8634-ca805b5e260f-logs\") pod \"4e6e94fd-e114-41b2-8634-ca805b5e260f\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.323674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-scripts\") pod \"4e6e94fd-e114-41b2-8634-ca805b5e260f\" (UID: \"4e6e94fd-e114-41b2-8634-ca805b5e260f\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.326394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4e6e94fd-e114-41b2-8634-ca805b5e260f-logs" (OuterVolumeSpecName: "logs") pod "4e6e94fd-e114-41b2-8634-ca805b5e260f" (UID: "4e6e94fd-e114-41b2-8634-ca805b5e260f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.331258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6e94fd-e114-41b2-8634-ca805b5e260f-kube-api-access-d7h2b" (OuterVolumeSpecName: "kube-api-access-d7h2b") pod "4e6e94fd-e114-41b2-8634-ca805b5e260f" (UID: "4e6e94fd-e114-41b2-8634-ca805b5e260f"). InnerVolumeSpecName "kube-api-access-d7h2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.345979 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-scripts" (OuterVolumeSpecName: "scripts") pod "4e6e94fd-e114-41b2-8634-ca805b5e260f" (UID: "4e6e94fd-e114-41b2-8634-ca805b5e260f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.355162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e6e94fd-e114-41b2-8634-ca805b5e260f" (UID: "4e6e94fd-e114-41b2-8634-ca805b5e260f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.377671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-config-data" (OuterVolumeSpecName: "config-data") pod "4e6e94fd-e114-41b2-8634-ca805b5e260f" (UID: "4e6e94fd-e114-41b2-8634-ca805b5e260f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.426495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-scripts\") pod \"180a38b2-06f7-49bb-8641-4d82c2e14183\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.426691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-fernet-keys\") pod \"180a38b2-06f7-49bb-8641-4d82c2e14183\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.426783 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-combined-ca-bundle\") pod \"180a38b2-06f7-49bb-8641-4d82c2e14183\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.426935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-credential-keys\") pod \"180a38b2-06f7-49bb-8641-4d82c2e14183\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.426989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-config-data\") pod \"180a38b2-06f7-49bb-8641-4d82c2e14183\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.427157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csxng\" (UniqueName: 
\"kubernetes.io/projected/180a38b2-06f7-49bb-8641-4d82c2e14183-kube-api-access-csxng\") pod \"180a38b2-06f7-49bb-8641-4d82c2e14183\" (UID: \"180a38b2-06f7-49bb-8641-4d82c2e14183\") " Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.427678 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.427705 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7h2b\" (UniqueName: \"kubernetes.io/projected/4e6e94fd-e114-41b2-8634-ca805b5e260f-kube-api-access-d7h2b\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.427722 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.427734 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6e94fd-e114-41b2-8634-ca805b5e260f-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.427745 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6e94fd-e114-41b2-8634-ca805b5e260f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.432622 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180a38b2-06f7-49bb-8641-4d82c2e14183-kube-api-access-csxng" (OuterVolumeSpecName: "kube-api-access-csxng") pod "180a38b2-06f7-49bb-8641-4d82c2e14183" (UID: "180a38b2-06f7-49bb-8641-4d82c2e14183"). InnerVolumeSpecName "kube-api-access-csxng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.435162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "180a38b2-06f7-49bb-8641-4d82c2e14183" (UID: "180a38b2-06f7-49bb-8641-4d82c2e14183"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.435190 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-scripts" (OuterVolumeSpecName: "scripts") pod "180a38b2-06f7-49bb-8641-4d82c2e14183" (UID: "180a38b2-06f7-49bb-8641-4d82c2e14183"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.437560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "180a38b2-06f7-49bb-8641-4d82c2e14183" (UID: "180a38b2-06f7-49bb-8641-4d82c2e14183"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.458824 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-config-data" (OuterVolumeSpecName: "config-data") pod "180a38b2-06f7-49bb-8641-4d82c2e14183" (UID: "180a38b2-06f7-49bb-8641-4d82c2e14183"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.461672 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "180a38b2-06f7-49bb-8641-4d82c2e14183" (UID: "180a38b2-06f7-49bb-8641-4d82c2e14183"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.501012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wsg2v" event={"ID":"180a38b2-06f7-49bb-8641-4d82c2e14183","Type":"ContainerDied","Data":"98473bfe28f6412d6adc4f301b7e3d51eae06583ed4a1fa49c3c7531e1f240c4"} Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.501296 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98473bfe28f6412d6adc4f301b7e3d51eae06583ed4a1fa49c3c7531e1f240c4" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.501124 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wsg2v" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.503831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerStarted","Data":"51b4a12ec7e314d65b826d2e0230a1e857723da769725f939a15b6b32f547477"} Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.505146 4749 generic.go:334] "Generic (PLEG): container finished" podID="9efa7d7f-28c0-4bd7-ae4f-f988968544c0" containerID="59151c6bf16b4a152bc622f54e0507052951f8cc27cdb87f310817d937e10c53" exitCode=0 Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.505200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jvrxf" event={"ID":"9efa7d7f-28c0-4bd7-ae4f-f988968544c0","Type":"ContainerDied","Data":"59151c6bf16b4a152bc622f54e0507052951f8cc27cdb87f310817d937e10c53"} Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.507368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qr8r7" event={"ID":"4e6e94fd-e114-41b2-8634-ca805b5e260f","Type":"ContainerDied","Data":"61100cbda9d361ef97e18e799345a72bb4279dd423e3032bbf4fa2a109dabb29"} Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.507423 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61100cbda9d361ef97e18e799345a72bb4279dd423e3032bbf4fa2a109dabb29" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.507422 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qr8r7" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.529437 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.529477 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.529495 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csxng\" (UniqueName: \"kubernetes.io/projected/180a38b2-06f7-49bb-8641-4d82c2e14183-kube-api-access-csxng\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.529508 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.529520 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.529532 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180a38b2-06f7-49bb-8641-4d82c2e14183-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.552424 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.552481 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/watcher-decision-engine-0" Feb 19 18:52:23 crc kubenswrapper[4749]: I0219 18:52:23.553174 4749 scope.go:117] "RemoveContainer" containerID="cecf2d389ccc361882dad3f9a6bf6528e23aa044d6a9d743e749d18ef74520d2" Feb 19 18:52:23 crc kubenswrapper[4749]: E0219 18:52:23.553481 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(763173db-176a-426d-bd85-e051d56ec5cf)\"" pod="openstack/watcher-decision-engine-0" podUID="763173db-176a-426d-bd85-e051d56ec5cf" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.356323 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78cffb9ffd-xjhv2"] Feb 19 18:52:24 crc kubenswrapper[4749]: E0219 18:52:24.358184 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6e94fd-e114-41b2-8634-ca805b5e260f" containerName="placement-db-sync" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.358281 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6e94fd-e114-41b2-8634-ca805b5e260f" containerName="placement-db-sync" Feb 19 18:52:24 crc kubenswrapper[4749]: E0219 18:52:24.358388 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180a38b2-06f7-49bb-8641-4d82c2e14183" containerName="keystone-bootstrap" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.358468 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="180a38b2-06f7-49bb-8641-4d82c2e14183" containerName="keystone-bootstrap" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.358777 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6e94fd-e114-41b2-8634-ca805b5e260f" containerName="placement-db-sync" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.358891 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="180a38b2-06f7-49bb-8641-4d82c2e14183" containerName="keystone-bootstrap" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.360186 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.362314 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.362438 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.362776 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.362953 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fdrlz" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.362951 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.389340 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78cffb9ffd-xjhv2"] Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.443318 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-59976cccdd-n7pz6"] Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.444846 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.449278 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.449925 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.450071 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.450443 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xqgf2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.453281 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.458471 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.462840 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59976cccdd-n7pz6"] Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-combined-ca-bundle\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-logs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " 
pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-combined-ca-bundle\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-public-tls-certs\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-public-tls-certs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545809 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-internal-tls-certs\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-credential-keys\") pod \"keystone-59976cccdd-n7pz6\" (UID: 
\"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-scripts\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-internal-tls-certs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-config-data\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-scripts\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.545961 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kntrq\" (UniqueName: \"kubernetes.io/projected/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-kube-api-access-kntrq\") pod \"keystone-59976cccdd-n7pz6\" (UID: 
\"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.546100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdjw\" (UniqueName: \"kubernetes.io/projected/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-kube-api-access-qsdjw\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.546236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-fernet-keys\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.546281 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-config-data\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-public-tls-certs\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-public-tls-certs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: 
\"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-internal-tls-certs\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-credential-keys\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-scripts\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-internal-tls-certs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-config-data\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 
18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-scripts\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kntrq\" (UniqueName: \"kubernetes.io/projected/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-kube-api-access-kntrq\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdjw\" (UniqueName: \"kubernetes.io/projected/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-kube-api-access-qsdjw\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-fernet-keys\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-config-data\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648513 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-combined-ca-bundle\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-logs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.648548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-combined-ca-bundle\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.649682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-logs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.655394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-combined-ca-bundle\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.659999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-internal-tls-certs\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.661493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-scripts\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.661493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-scripts\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.661896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-config-data\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.661926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-public-tls-certs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.662070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-combined-ca-bundle\") pod \"placement-78cffb9ffd-xjhv2\" (UID: 
\"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.662477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-internal-tls-certs\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.662997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-fernet-keys\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.670882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdjw\" (UniqueName: \"kubernetes.io/projected/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-kube-api-access-qsdjw\") pod \"placement-78cffb9ffd-xjhv2\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.675709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-config-data\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.676229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kntrq\" (UniqueName: \"kubernetes.io/projected/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-kube-api-access-kntrq\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 
crc kubenswrapper[4749]: I0219 18:52:24.696935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-credential-keys\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.703752 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559ba668-4259-4ec1-a8b7-e6ab6b78d4b6-public-tls-certs\") pod \"keystone-59976cccdd-n7pz6\" (UID: \"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6\") " pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.721152 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.764926 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.863137 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.886309 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.923508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.931849 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.931896 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.986356 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.986390 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 18:52:24 crc kubenswrapper[4749]: I0219 18:52:24.988710 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.003878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.034275 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.043215 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.058621 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hkr2c\" (UniqueName: \"kubernetes.io/projected/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-kube-api-access-hkr2c\") pod \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.058691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-combined-ca-bundle\") pod \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.058758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-config\") pod \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\" (UID: \"9efa7d7f-28c0-4bd7-ae4f-f988968544c0\") " Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.077356 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-kube-api-access-hkr2c" (OuterVolumeSpecName: "kube-api-access-hkr2c") pod "9efa7d7f-28c0-4bd7-ae4f-f988968544c0" (UID: "9efa7d7f-28c0-4bd7-ae4f-f988968544c0"). InnerVolumeSpecName "kube-api-access-hkr2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.121215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9efa7d7f-28c0-4bd7-ae4f-f988968544c0" (UID: "9efa7d7f-28c0-4bd7-ae4f-f988968544c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.124201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-config" (OuterVolumeSpecName: "config") pod "9efa7d7f-28c0-4bd7-ae4f-f988968544c0" (UID: "9efa7d7f-28c0-4bd7-ae4f-f988968544c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.164473 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkr2c\" (UniqueName: \"kubernetes.io/projected/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-kube-api-access-hkr2c\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.164701 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.164782 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9efa7d7f-28c0-4bd7-ae4f-f988968544c0-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.211738 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78cffb9ffd-xjhv2"] Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.300136 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59976cccdd-n7pz6"] Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.556909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59976cccdd-n7pz6" event={"ID":"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6","Type":"ContainerStarted","Data":"19b4d47ee37ba1c72e4ba07e2f3d9aaa267ec910a2ca1187d8118bf0af0a7fd8"} Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.556959 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-59976cccdd-n7pz6" event={"ID":"559ba668-4259-4ec1-a8b7-e6ab6b78d4b6","Type":"ContainerStarted","Data":"75f47ed63e06e8a05cf9128852a0151025cc1cf233773fc4feb3c9a126adf5e8"} Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.558255 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-59976cccdd-n7pz6" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.579397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jvrxf" event={"ID":"9efa7d7f-28c0-4bd7-ae4f-f988968544c0","Type":"ContainerDied","Data":"0d743b7b2b16f0298b31b12e733f020cef7960475ba08e5103cab7a47bb20256"} Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.579705 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d743b7b2b16f0298b31b12e733f020cef7960475ba08e5103cab7a47bb20256" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.579775 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jvrxf" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.600773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cffb9ffd-xjhv2" event={"ID":"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5","Type":"ContainerStarted","Data":"44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630"} Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.600820 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.600833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cffb9ffd-xjhv2" event={"ID":"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5","Type":"ContainerStarted","Data":"750433d7989330751a636d9ae82300173cddae73dcd48458f86abc04cf9e9dc0"} Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.600848 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.601121 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.601222 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.609164 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-59976cccdd-n7pz6" podStartSLOduration=1.6091417319999999 podStartE2EDuration="1.609141732s" podCreationTimestamp="2026-02-19 18:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:25.592397684 +0000 UTC m=+1119.553617638" watchObservedRunningTime="2026-02-19 18:52:25.609141732 +0000 UTC m=+1119.570361676" Feb 19 18:52:25 crc 
kubenswrapper[4749]: I0219 18:52:25.708296 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.926948 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-687898f47c-94xfz"] Feb 19 18:52:25 crc kubenswrapper[4749]: E0219 18:52:25.927509 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efa7d7f-28c0-4bd7-ae4f-f988968544c0" containerName="neutron-db-sync" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.927528 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efa7d7f-28c0-4bd7-ae4f-f988968544c0" containerName="neutron-db-sync" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.927737 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efa7d7f-28c0-4bd7-ae4f-f988968544c0" containerName="neutron-db-sync" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.928739 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.937866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-687898f47c-94xfz"] Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.983920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-config\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.984015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-svc\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.984066 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-swift-storage-0\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.984090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz79\" (UniqueName: \"kubernetes.io/projected/463b2048-d918-4de3-834d-b6899813a604-kube-api-access-hcz79\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.984139 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-nb\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:25 crc kubenswrapper[4749]: I0219 18:52:25.984178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-sb\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.077987 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-64c65fc786-5rs9n"] Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.079779 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.091447 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-h7wtm" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.091891 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.092499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-config\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.092581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-svc\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.092617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-swift-storage-0\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.092640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcz79\" (UniqueName: \"kubernetes.io/projected/463b2048-d918-4de3-834d-b6899813a604-kube-api-access-hcz79\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.092678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-nb\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.092696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-sb\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.092754 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 
18:52:26.092949 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.093495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-sb\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.094011 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-swift-storage-0\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.094224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-nb\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.094711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-config\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.095315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-svc\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" 
Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.119824 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64c65fc786-5rs9n"] Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.140895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcz79\" (UniqueName: \"kubernetes.io/projected/463b2048-d918-4de3-834d-b6899813a604-kube-api-access-hcz79\") pod \"dnsmasq-dns-687898f47c-94xfz\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.198182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-httpd-config\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.198221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-ovndb-tls-certs\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.198286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8czks\" (UniqueName: \"kubernetes.io/projected/cd0afcb9-06ed-4981-a030-34f26ae748c3-kube-api-access-8czks\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.198335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-combined-ca-bundle\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.198375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-config\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.272460 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.299443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-config\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.299523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-httpd-config\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.299547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-ovndb-tls-certs\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.299612 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8czks\" (UniqueName: \"kubernetes.io/projected/cd0afcb9-06ed-4981-a030-34f26ae748c3-kube-api-access-8czks\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.299661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-combined-ca-bundle\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.306749 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-combined-ca-bundle\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.306998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-httpd-config\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.311912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-ovndb-tls-certs\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.321206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-config\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.340254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8czks\" (UniqueName: \"kubernetes.io/projected/cd0afcb9-06ed-4981-a030-34f26ae748c3-kube-api-access-8czks\") pod \"neutron-64c65fc786-5rs9n\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.414296 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.633637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cffb9ffd-xjhv2" event={"ID":"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5","Type":"ContainerStarted","Data":"7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a"} Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.633946 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.634420 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:52:26 crc kubenswrapper[4749]: I0219 18:52:26.713255 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78cffb9ffd-xjhv2" podStartSLOduration=2.713238711 podStartE2EDuration="2.713238711s" podCreationTimestamp="2026-02-19 18:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:26.666411454 +0000 UTC m=+1120.627631408" watchObservedRunningTime="2026-02-19 18:52:26.713238711 +0000 UTC m=+1120.674458665" Feb 19 
18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.242563 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-687898f47c-94xfz"] Feb 19 18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.272081 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64c65fc786-5rs9n"] Feb 19 18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.663800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687898f47c-94xfz" event={"ID":"463b2048-d918-4de3-834d-b6899813a604","Type":"ContainerStarted","Data":"013b5b22db710aaa5152174cfb1c99cedf56c97f065975447ab7f94d49e26da8"} Feb 19 18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.666855 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.666874 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.667667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64c65fc786-5rs9n" event={"ID":"cd0afcb9-06ed-4981-a030-34f26ae748c3","Type":"ContainerStarted","Data":"5778186b98cdccbb043af081d747247c9984d10e46b707a090a14f8a3b85013d"} Feb 19 18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.668300 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 18:52:27 crc kubenswrapper[4749]: I0219 18:52:27.668329 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.080270 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.289384 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.390738 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-85c7d94649-hz2gq"] Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.392885 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.397381 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.397623 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.405376 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85c7d94649-hz2gq"] Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.452083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-public-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.452129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtnzj\" (UniqueName: \"kubernetes.io/projected/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-kube-api-access-dtnzj\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.452148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-ovndb-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.452208 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-config\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.452271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-combined-ca-bundle\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.452336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-httpd-config\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.452376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-internal-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.554535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-httpd-config\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.554609 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-internal-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.554636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-public-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.554657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtnzj\" (UniqueName: \"kubernetes.io/projected/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-kube-api-access-dtnzj\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.554675 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-ovndb-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.554727 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-config\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.554758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-combined-ca-bundle\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.561511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-public-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.571151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-combined-ca-bundle\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.571532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-httpd-config\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.572393 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-config\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.573010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-ovndb-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: 
\"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.575383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtnzj\" (UniqueName: \"kubernetes.io/projected/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-kube-api-access-dtnzj\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.575531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc4f02b-0135-46b5-ad46-aa2a9ce82f54-internal-tls-certs\") pod \"neutron-85c7d94649-hz2gq\" (UID: \"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54\") " pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.702737 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-548647668b-bwckt"] Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.704944 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.720787 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.724989 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.738244 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-548647668b-bwckt"] Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.861121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-scripts\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.861536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6msk2\" (UniqueName: \"kubernetes.io/projected/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-kube-api-access-6msk2\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.861580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-internal-tls-certs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.861669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-logs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.861713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-public-tls-certs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.861738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-combined-ca-bundle\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.861758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-config-data\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.963591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-public-tls-certs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.964416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-combined-ca-bundle\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.964451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-config-data\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.964902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-scripts\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.965363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6msk2\" (UniqueName: \"kubernetes.io/projected/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-kube-api-access-6msk2\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.965399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-internal-tls-certs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.965461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-logs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.965774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-logs\") pod \"placement-548647668b-bwckt\" (UID: 
\"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.970314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-config-data\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.973447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-internal-tls-certs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.976776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-combined-ca-bundle\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.980371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-public-tls-certs\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: I0219 18:52:28.980660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-scripts\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:28 crc kubenswrapper[4749]: 
I0219 18:52:28.992642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6msk2\" (UniqueName: \"kubernetes.io/projected/d7dd258c-a64d-49cc-acf0-5bf79f10e8a5-kube-api-access-6msk2\") pod \"placement-548647668b-bwckt\" (UID: \"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5\") " pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:29 crc kubenswrapper[4749]: I0219 18:52:29.039436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:29 crc kubenswrapper[4749]: I0219 18:52:29.525570 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85c7d94649-hz2gq"] Feb 19 18:52:29 crc kubenswrapper[4749]: W0219 18:52:29.528950 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc4f02b_0135_46b5_ad46_aa2a9ce82f54.slice/crio-c1333ecf59e44a104464da645335330bcc948fdbab4c7cd7a767026217762861 WatchSource:0}: Error finding container c1333ecf59e44a104464da645335330bcc948fdbab4c7cd7a767026217762861: Status 404 returned error can't find the container with id c1333ecf59e44a104464da645335330bcc948fdbab4c7cd7a767026217762861 Feb 19 18:52:29 crc kubenswrapper[4749]: I0219 18:52:29.591780 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-548647668b-bwckt"] Feb 19 18:52:29 crc kubenswrapper[4749]: W0219 18:52:29.608330 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7dd258c_a64d_49cc_acf0_5bf79f10e8a5.slice/crio-382db8fd3ff681b778af1b7ad243bafba691923209a2657ce03a91757265e619 WatchSource:0}: Error finding container 382db8fd3ff681b778af1b7ad243bafba691923209a2657ce03a91757265e619: Status 404 returned error can't find the container with id 382db8fd3ff681b778af1b7ad243bafba691923209a2657ce03a91757265e619 Feb 19 18:52:29 crc kubenswrapper[4749]: I0219 18:52:29.703558 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85c7d94649-hz2gq" event={"ID":"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54","Type":"ContainerStarted","Data":"c1333ecf59e44a104464da645335330bcc948fdbab4c7cd7a767026217762861"} Feb 19 18:52:29 crc kubenswrapper[4749]: I0219 18:52:29.710848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-548647668b-bwckt" event={"ID":"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5","Type":"ContainerStarted","Data":"382db8fd3ff681b778af1b7ad243bafba691923209a2657ce03a91757265e619"} Feb 19 18:52:30 crc kubenswrapper[4749]: I0219 18:52:30.745145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f8dgh" event={"ID":"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d","Type":"ContainerStarted","Data":"2afb535e3c37a3548a563720815df50da4e49fd0ce861c86afb0dccb93d27954"} Feb 19 18:52:30 crc kubenswrapper[4749]: I0219 18:52:30.785317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64c65fc786-5rs9n" event={"ID":"cd0afcb9-06ed-4981-a030-34f26ae748c3","Type":"ContainerStarted","Data":"7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab"} Feb 19 18:52:30 crc kubenswrapper[4749]: I0219 18:52:30.796746 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f8dgh" podStartSLOduration=8.169544955 podStartE2EDuration="47.796726546s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="2026-02-19 18:51:46.207469885 +0000 UTC m=+1080.168689839" lastFinishedPulling="2026-02-19 18:52:25.834651476 +0000 UTC m=+1119.795871430" observedRunningTime="2026-02-19 18:52:30.768393005 +0000 UTC m=+1124.729612969" watchObservedRunningTime="2026-02-19 18:52:30.796726546 +0000 UTC m=+1124.757946500" Feb 19 18:52:30 crc kubenswrapper[4749]: I0219 18:52:30.808096 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85c7d94649-hz2gq" 
event={"ID":"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54","Type":"ContainerStarted","Data":"2c79f2a9fe1adbdb0baf895817a69a5e6a096a0b9c045611671fbb4cf2ab46ee"} Feb 19 18:52:30 crc kubenswrapper[4749]: I0219 18:52:30.819479 4749 generic.go:334] "Generic (PLEG): container finished" podID="463b2048-d918-4de3-834d-b6899813a604" containerID="da0e8cb375b3aaa51c4f256052df56da7007a9e5503dc1ef1a0a03fdf04aefa0" exitCode=0 Feb 19 18:52:30 crc kubenswrapper[4749]: I0219 18:52:30.819545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687898f47c-94xfz" event={"ID":"463b2048-d918-4de3-834d-b6899813a604","Type":"ContainerDied","Data":"da0e8cb375b3aaa51c4f256052df56da7007a9e5503dc1ef1a0a03fdf04aefa0"} Feb 19 18:52:30 crc kubenswrapper[4749]: I0219 18:52:30.835361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-548647668b-bwckt" event={"ID":"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5","Type":"ContainerStarted","Data":"ca947258e53b1aec857f48b3b17c790b23d56a3a802a20d2eb6de2b187eb25b9"} Feb 19 18:52:32 crc kubenswrapper[4749]: I0219 18:52:32.583774 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:32 crc kubenswrapper[4749]: I0219 18:52:32.586117 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api" containerID="cri-o://a0c6a0e375f952d13efff3835e318856f4b39cf1139c89d3be131f5c1346fcf2" gracePeriod=30 Feb 19 18:52:32 crc kubenswrapper[4749]: I0219 18:52:32.586334 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api-log" containerID="cri-o://e259488edfb845b5bdfdc9dc7cbd72198f2564b425afee6ca4d61e66f4cffa87" gracePeriod=30 Feb 19 18:52:32 crc kubenswrapper[4749]: I0219 18:52:32.883692 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerID="e259488edfb845b5bdfdc9dc7cbd72198f2564b425afee6ca4d61e66f4cffa87" exitCode=143 Feb 19 18:52:32 crc kubenswrapper[4749]: I0219 18:52:32.883914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ebdea16c-c5e5-437d-a23c-8d710b197088","Type":"ContainerDied","Data":"e259488edfb845b5bdfdc9dc7cbd72198f2564b425afee6ca4d61e66f4cffa87"} Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.466641 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9322/\": read tcp 10.217.0.2:41794->10.217.0.172:9322: read: connection reset by peer" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.466686 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.172:9322/\": read tcp 10.217.0.2:41792->10.217.0.172:9322: read: connection reset by peer" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.730453 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.730573 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.736822 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.765534 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.773161 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.773277 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.773567 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.928015 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerID="a0c6a0e375f952d13efff3835e318856f4b39cf1139c89d3be131f5c1346fcf2" exitCode=0 Feb 19 18:52:34 crc kubenswrapper[4749]: I0219 18:52:34.928062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ebdea16c-c5e5-437d-a23c-8d710b197088","Type":"ContainerDied","Data":"a0c6a0e375f952d13efff3835e318856f4b39cf1139c89d3be131f5c1346fcf2"} Feb 19 18:52:35 crc kubenswrapper[4749]: I0219 18:52:35.292483 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:52:37 crc kubenswrapper[4749]: I0219 18:52:37.083695 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:52:37 crc kubenswrapper[4749]: I0219 18:52:37.417590 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b4d589db8-c89ft" Feb 19 18:52:37 crc kubenswrapper[4749]: I0219 18:52:37.489887 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-866f5f4f4b-zfvsn"] Feb 19 18:52:37 crc kubenswrapper[4749]: I0219 18:52:37.957577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64c65fc786-5rs9n" event={"ID":"cd0afcb9-06ed-4981-a030-34f26ae748c3","Type":"ContainerStarted","Data":"8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842"} Feb 19 18:52:37 crc kubenswrapper[4749]: I0219 18:52:37.957737 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-866f5f4f4b-zfvsn" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon-log" containerID="cri-o://7ed458c93c10e8a5d70696643628559a1656a862171ba6383208264fc426b8b5" gracePeriod=30 Feb 19 18:52:37 crc kubenswrapper[4749]: I0219 18:52:37.957833 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-866f5f4f4b-zfvsn" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon" containerID="cri-o://2b8e7df8773bc6af522d0c6c7d572642c3a5fdcfab6d7b7b20a03d6c230947c7" gracePeriod=30 Feb 19 18:52:37 crc kubenswrapper[4749]: I0219 18:52:37.984378 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-64c65fc786-5rs9n" podStartSLOduration=11.984354229000001 podStartE2EDuration="11.984354229s" podCreationTimestamp="2026-02-19 18:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:37.976287964 +0000 UTC m=+1131.937507928" watchObservedRunningTime="2026-02-19 18:52:37.984354229 +0000 UTC m=+1131.945574193" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.197940 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.287723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-custom-prometheus-ca\") pod \"ebdea16c-c5e5-437d-a23c-8d710b197088\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.288152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebdea16c-c5e5-437d-a23c-8d710b197088-logs\") pod \"ebdea16c-c5e5-437d-a23c-8d710b197088\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.288206 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tjdd\" (UniqueName: \"kubernetes.io/projected/ebdea16c-c5e5-437d-a23c-8d710b197088-kube-api-access-7tjdd\") pod \"ebdea16c-c5e5-437d-a23c-8d710b197088\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.288255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-combined-ca-bundle\") pod \"ebdea16c-c5e5-437d-a23c-8d710b197088\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.288293 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-config-data\") pod \"ebdea16c-c5e5-437d-a23c-8d710b197088\" (UID: \"ebdea16c-c5e5-437d-a23c-8d710b197088\") " Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.288548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ebdea16c-c5e5-437d-a23c-8d710b197088-logs" (OuterVolumeSpecName: "logs") pod "ebdea16c-c5e5-437d-a23c-8d710b197088" (UID: "ebdea16c-c5e5-437d-a23c-8d710b197088"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.288913 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebdea16c-c5e5-437d-a23c-8d710b197088-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.314339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebdea16c-c5e5-437d-a23c-8d710b197088-kube-api-access-7tjdd" (OuterVolumeSpecName: "kube-api-access-7tjdd") pod "ebdea16c-c5e5-437d-a23c-8d710b197088" (UID: "ebdea16c-c5e5-437d-a23c-8d710b197088"). InnerVolumeSpecName "kube-api-access-7tjdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.321559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ebdea16c-c5e5-437d-a23c-8d710b197088" (UID: "ebdea16c-c5e5-437d-a23c-8d710b197088"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.353108 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebdea16c-c5e5-437d-a23c-8d710b197088" (UID: "ebdea16c-c5e5-437d-a23c-8d710b197088"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.353353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-config-data" (OuterVolumeSpecName: "config-data") pod "ebdea16c-c5e5-437d-a23c-8d710b197088" (UID: "ebdea16c-c5e5-437d-a23c-8d710b197088"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.390701 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tjdd\" (UniqueName: \"kubernetes.io/projected/ebdea16c-c5e5-437d-a23c-8d710b197088-kube-api-access-7tjdd\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.390732 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.390742 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.390751 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ebdea16c-c5e5-437d-a23c-8d710b197088-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:38 crc kubenswrapper[4749]: I0219 18:52:38.679524 4749 scope.go:117] "RemoveContainer" containerID="cecf2d389ccc361882dad3f9a6bf6528e23aa044d6a9d743e749d18ef74520d2" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.023021 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85c7d94649-hz2gq" 
event={"ID":"3bc4f02b-0135-46b5-ad46-aa2a9ce82f54","Type":"ContainerStarted","Data":"a92131655ad771bc30681224b634aa8445b9ac714e9434fa81907224d810137f"} Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.024333 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85c7d94649-hz2gq" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.060447 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85c7d94649-hz2gq" podStartSLOduration=11.060424886 podStartE2EDuration="11.060424886s" podCreationTimestamp="2026-02-19 18:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:39.051544578 +0000 UTC m=+1133.012764552" watchObservedRunningTime="2026-02-19 18:52:39.060424886 +0000 UTC m=+1133.021644830" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.106346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ebdea16c-c5e5-437d-a23c-8d710b197088","Type":"ContainerDied","Data":"57b34eb5808f62325e2d3e419ab2ce6a673bdb69fe2f15150519f058a015a7ee"} Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.106403 4749 scope.go:117] "RemoveContainer" containerID="a0c6a0e375f952d13efff3835e318856f4b39cf1139c89d3be131f5c1346fcf2" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.106562 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.150184 4749 generic.go:334] "Generic (PLEG): container finished" podID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerID="2b8e7df8773bc6af522d0c6c7d572642c3a5fdcfab6d7b7b20a03d6c230947c7" exitCode=0 Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.151084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866f5f4f4b-zfvsn" event={"ID":"f8dcae23-3df3-4de3-8a0a-499c15a90daa","Type":"ContainerDied","Data":"2b8e7df8773bc6af522d0c6c7d572642c3a5fdcfab6d7b7b20a03d6c230947c7"} Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.151112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.208976 4749 scope.go:117] "RemoveContainer" containerID="e259488edfb845b5bdfdc9dc7cbd72198f2564b425afee6ca4d61e66f4cffa87" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.284176 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.301845 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.368871 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:39 crc kubenswrapper[4749]: E0219 18:52:39.369619 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.369633 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api" Feb 19 18:52:39 crc kubenswrapper[4749]: E0219 18:52:39.369653 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" 
containerName="watcher-api-log" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.369659 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api-log" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.371015 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api-log" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.371126 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.372341 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.375192 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.375393 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.375553 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.384437 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.415086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.415334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-logs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.415417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-config-data\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.415493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-kube-api-access-kdbzq\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.415569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.415658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.415731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-public-tls-certs\") 
pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.517682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.517746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.517913 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.517969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-logs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.518011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-config-data\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.518080 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-kube-api-access-kdbzq\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.518145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.518386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-logs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.523791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.525110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.526144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-combined-ca-bundle\") pod \"watcher-api-0\" (UID: 
\"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.528271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-config-data\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.534811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.548524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-kube-api-access-kdbzq\") pod \"watcher-api-0\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " pod="openstack/watcher-api-0" Feb 19 18:52:39 crc kubenswrapper[4749]: I0219 18:52:39.693454 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.163456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-548647668b-bwckt" event={"ID":"d7dd258c-a64d-49cc-acf0-5bf79f10e8a5","Type":"ContainerStarted","Data":"c36fa814c17b9f1d274e1b1d69b791143a39f19a949f14b2d522fd572c674f05"} Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.165275 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.165543 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-548647668b-bwckt" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.167987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerStarted","Data":"bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd"} Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.181764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerStarted","Data":"1884f8aec9b918e3e847629d5b23ad3ef1071c728b2864d3092f53a1e68e86ae"} Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.181885 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-central-agent" containerID="cri-o://6ce9e7be68ee2c73ea58411c5582a5214befdbaa9b34e3fc190b6c493b5cc976" gracePeriod=30 Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.181980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.182038 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="proxy-httpd" containerID="cri-o://1884f8aec9b918e3e847629d5b23ad3ef1071c728b2864d3092f53a1e68e86ae" gracePeriod=30 Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.182080 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="sg-core" containerID="cri-o://51b4a12ec7e314d65b826d2e0230a1e857723da769725f939a15b6b32f547477" gracePeriod=30 Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.182113 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-notification-agent" containerID="cri-o://d9d38f858e6edc203dc8df1bf5835ed813e32cef59fc673493c9bd967a7663f5" gracePeriod=30 Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.194171 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.199885 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-548647668b-bwckt" podStartSLOduration=12.199863792 podStartE2EDuration="12.199863792s" podCreationTimestamp="2026-02-19 18:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:40.196810047 +0000 UTC m=+1134.158030001" watchObservedRunningTime="2026-02-19 18:52:40.199863792 +0000 UTC m=+1134.161083746" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.200137 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cmbgb" event={"ID":"5701fcc2-ae2a-4017-8991-3470421ff234","Type":"ContainerStarted","Data":"fd16a22fd1e877553f10b8e81bace93a1789b59cfab3e6a9937e8cdb3cbf4092"} Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.215738 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687898f47c-94xfz" event={"ID":"463b2048-d918-4de3-834d-b6899813a604","Type":"ContainerStarted","Data":"559448d79682b63b3e3f5289796e09dc6c05b11dae7e863c5ac2cdb3e5792652"} Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.215785 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.256397 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.512190566 podStartE2EDuration="57.256377409s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="2026-02-19 18:51:46.206793038 +0000 UTC m=+1080.168012992" lastFinishedPulling="2026-02-19 18:52:38.950979871 +0000 UTC m=+1132.912199835" observedRunningTime="2026-02-19 18:52:40.252258674 +0000 UTC m=+1134.213478638" watchObservedRunningTime="2026-02-19 18:52:40.256377409 +0000 UTC m=+1134.217597363" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.284611 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-687898f47c-94xfz" podStartSLOduration=15.284595027 podStartE2EDuration="15.284595027s" podCreationTimestamp="2026-02-19 18:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:40.271292226 +0000 UTC m=+1134.232512180" watchObservedRunningTime="2026-02-19 18:52:40.284595027 +0000 UTC m=+1134.245814981" Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.299765 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cmbgb" podStartSLOduration=3.589218743 podStartE2EDuration="57.29974778s" podCreationTimestamp="2026-02-19 18:51:43 +0000 UTC" firstStartedPulling="2026-02-19 18:51:45.235356611 +0000 UTC m=+1079.196576565" 
lastFinishedPulling="2026-02-19 18:52:38.945885638 +0000 UTC m=+1132.907105602" observedRunningTime="2026-02-19 18:52:40.289164965 +0000 UTC m=+1134.250384919" watchObservedRunningTime="2026-02-19 18:52:40.29974778 +0000 UTC m=+1134.260967734"
Feb 19 18:52:40 crc kubenswrapper[4749]: I0219 18:52:40.689103 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" path="/var/lib/kubelet/pods/ebdea16c-c5e5-437d-a23c-8d710b197088/volumes"
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.224267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5d0300f8-4f96-4a80-ab8c-ebfec24dce10","Type":"ContainerStarted","Data":"407cee09108dc1abb8c7926b286e7267ac6421007336a636481ae314b11fc6cb"}
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.224909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5d0300f8-4f96-4a80-ab8c-ebfec24dce10","Type":"ContainerStarted","Data":"0b2d83fedc147ae1ae77fd48b66a049fc9f67e5140ec6175f6d725c0bd11c5c4"}
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.224926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5d0300f8-4f96-4a80-ab8c-ebfec24dce10","Type":"ContainerStarted","Data":"dc6872dc284041b62c24ae595013b41da166ccf5b9ebfe7df81d5208d24962f9"}
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.224967 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.226371 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" containerID="2afb535e3c37a3548a563720815df50da4e49fd0ce861c86afb0dccb93d27954" exitCode=0
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.226449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f8dgh" event={"ID":"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d","Type":"ContainerDied","Data":"2afb535e3c37a3548a563720815df50da4e49fd0ce861c86afb0dccb93d27954"}
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.229724 4749 generic.go:334] "Generic (PLEG): container finished" podID="66d13d51-29b5-463f-873d-4a586878e0c4" containerID="1884f8aec9b918e3e847629d5b23ad3ef1071c728b2864d3092f53a1e68e86ae" exitCode=0
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.229761 4749 generic.go:334] "Generic (PLEG): container finished" podID="66d13d51-29b5-463f-873d-4a586878e0c4" containerID="51b4a12ec7e314d65b826d2e0230a1e857723da769725f939a15b6b32f547477" exitCode=2
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.229770 4749 generic.go:334] "Generic (PLEG): container finished" podID="66d13d51-29b5-463f-873d-4a586878e0c4" containerID="6ce9e7be68ee2c73ea58411c5582a5214befdbaa9b34e3fc190b6c493b5cc976" exitCode=0
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.229791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerDied","Data":"1884f8aec9b918e3e847629d5b23ad3ef1071c728b2864d3092f53a1e68e86ae"}
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.229825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerDied","Data":"51b4a12ec7e314d65b826d2e0230a1e857723da769725f939a15b6b32f547477"}
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.229838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerDied","Data":"6ce9e7be68ee2c73ea58411c5582a5214befdbaa9b34e3fc190b6c493b5cc976"}
Feb 19 18:52:41 crc kubenswrapper[4749]: I0219 18:52:41.248718 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.248696338 podStartE2EDuration="2.248696338s" podCreationTimestamp="2026-02-19 18:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:41.242368561 +0000 UTC m=+1135.203588535" watchObservedRunningTime="2026-02-19 18:52:41.248696338 +0000 UTC m=+1135.209916302"
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.246790 4749 generic.go:334] "Generic (PLEG): container finished" podID="66d13d51-29b5-463f-873d-4a586878e0c4" containerID="d9d38f858e6edc203dc8df1bf5835ed813e32cef59fc673493c9bd967a7663f5" exitCode=0
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.246875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerDied","Data":"d9d38f858e6edc203dc8df1bf5835ed813e32cef59fc673493c9bd967a7663f5"}
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.247393 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.346902 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-866f5f4f4b-zfvsn" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.166:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.166:8443: connect: connection refused"
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.521418 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-548647668b-bwckt"
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.527276 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.638302 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f8dgh"
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.683659 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/66d13d51-29b5-463f-873d-4a586878e0c4-kube-api-access-pzbdf\") pod \"66d13d51-29b5-463f-873d-4a586878e0c4\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.683739 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-combined-ca-bundle\") pod \"66d13d51-29b5-463f-873d-4a586878e0c4\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.683770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-log-httpd\") pod \"66d13d51-29b5-463f-873d-4a586878e0c4\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.683791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-sg-core-conf-yaml\") pod \"66d13d51-29b5-463f-873d-4a586878e0c4\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.683825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-config-data\") pod \"66d13d51-29b5-463f-873d-4a586878e0c4\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.683884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-scripts\") pod \"66d13d51-29b5-463f-873d-4a586878e0c4\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.683943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-run-httpd\") pod \"66d13d51-29b5-463f-873d-4a586878e0c4\" (UID: \"66d13d51-29b5-463f-873d-4a586878e0c4\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.684238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "66d13d51-29b5-463f-873d-4a586878e0c4" (UID: "66d13d51-29b5-463f-873d-4a586878e0c4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.684622 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.684883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "66d13d51-29b5-463f-873d-4a586878e0c4" (UID: "66d13d51-29b5-463f-873d-4a586878e0c4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.692153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-scripts" (OuterVolumeSpecName: "scripts") pod "66d13d51-29b5-463f-873d-4a586878e0c4" (UID: "66d13d51-29b5-463f-873d-4a586878e0c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.692246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d13d51-29b5-463f-873d-4a586878e0c4-kube-api-access-pzbdf" (OuterVolumeSpecName: "kube-api-access-pzbdf") pod "66d13d51-29b5-463f-873d-4a586878e0c4" (UID: "66d13d51-29b5-463f-873d-4a586878e0c4"). InnerVolumeSpecName "kube-api-access-pzbdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.719195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "66d13d51-29b5-463f-873d-4a586878e0c4" (UID: "66d13d51-29b5-463f-873d-4a586878e0c4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.772262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66d13d51-29b5-463f-873d-4a586878e0c4" (UID: "66d13d51-29b5-463f-873d-4a586878e0c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.785286 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-db-sync-config-data\") pod \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.785358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-config-data\") pod \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.785393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-scripts\") pod \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.785472 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh6wq\" (UniqueName: \"kubernetes.io/projected/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-kube-api-access-xh6wq\") pod \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.785527 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-etc-machine-id\") pod \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.785549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-combined-ca-bundle\") pod \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\" (UID: \"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d\") "
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.786070 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/66d13d51-29b5-463f-873d-4a586878e0c4-kube-api-access-pzbdf\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.786088 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.786098 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.786108 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.786116 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66d13d51-29b5-463f-873d-4a586878e0c4-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.787521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" (UID: "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.791408 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-kube-api-access-xh6wq" (OuterVolumeSpecName: "kube-api-access-xh6wq") pod "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" (UID: "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d"). InnerVolumeSpecName "kube-api-access-xh6wq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.793036 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-scripts" (OuterVolumeSpecName: "scripts") pod "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" (UID: "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.793146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" (UID: "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.808240 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-config-data" (OuterVolumeSpecName: "config-data") pod "66d13d51-29b5-463f-873d-4a586878e0c4" (UID: "66d13d51-29b5-463f-873d-4a586878e0c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.816116 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" (UID: "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.843060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-config-data" (OuterVolumeSpecName: "config-data") pod "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" (UID: "d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.887354 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.887389 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.887400 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.887410 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh6wq\" (UniqueName: \"kubernetes.io/projected/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-kube-api-access-xh6wq\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.887420 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.887428 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:42 crc kubenswrapper[4749]: I0219 18:52:42.887439 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d13d51-29b5-463f-873d-4a586878e0c4-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.081932 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.081936 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ebdea16c-c5e5-437d-a23c-8d710b197088" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.172:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.260800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66d13d51-29b5-463f-873d-4a586878e0c4","Type":"ContainerDied","Data":"995fa8fcb59ed8027892c9e5f75bee09ddddc8ec01d3286d286b049176900f50"}
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.260881 4749 scope.go:117] "RemoveContainer" containerID="1884f8aec9b918e3e847629d5b23ad3ef1071c728b2864d3092f53a1e68e86ae"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.261111 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.278601 4749 generic.go:334] "Generic (PLEG): container finished" podID="763173db-176a-426d-bd85-e051d56ec5cf" containerID="bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd" exitCode=1
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.278676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerDied","Data":"bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd"}
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.279501 4749 scope.go:117] "RemoveContainer" containerID="bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd"
Feb 19 18:52:43 crc kubenswrapper[4749]: E0219 18:52:43.279888 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(763173db-176a-426d-bd85-e051d56ec5cf)\"" pod="openstack/watcher-decision-engine-0" podUID="763173db-176a-426d-bd85-e051d56ec5cf"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.296776 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f8dgh"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.298292 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f8dgh" event={"ID":"d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d","Type":"ContainerDied","Data":"16f9265adf1eeb0e3748eff49f78899440489e65956dcd213849389af444239e"}
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.298349 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16f9265adf1eeb0e3748eff49f78899440489e65956dcd213849389af444239e"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.309629 4749 scope.go:117] "RemoveContainer" containerID="51b4a12ec7e314d65b826d2e0230a1e857723da769725f939a15b6b32f547477"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.381019 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.393112 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408097 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:52:43 crc kubenswrapper[4749]: E0219 18:52:43.408511 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="proxy-httpd"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408527 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="proxy-httpd"
Feb 19 18:52:43 crc kubenswrapper[4749]: E0219 18:52:43.408543 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="sg-core"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408549 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="sg-core"
Feb 19 18:52:43 crc kubenswrapper[4749]: E0219 18:52:43.408560 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-notification-agent"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408566 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-notification-agent"
Feb 19 18:52:43 crc kubenswrapper[4749]: E0219 18:52:43.408585 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" containerName="cinder-db-sync"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408591 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" containerName="cinder-db-sync"
Feb 19 18:52:43 crc kubenswrapper[4749]: E0219 18:52:43.408605 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-central-agent"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408610 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-central-agent"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408772 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-central-agent"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408790 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="ceilometer-notification-agent"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408801 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="sg-core"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408809 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" containerName="cinder-db-sync"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.408819 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" containerName="proxy-httpd"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.410592 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.413106 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.413532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.417679 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.419969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.420020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-config-data\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.420081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-scripts\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.420109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.420237 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkms\" (UniqueName: \"kubernetes.io/projected/b7838589-f49f-4b20-a019-48db9b2ae719-kube-api-access-cgkms\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.420381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-log-httpd\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.420443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-run-httpd\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.427496 4749 scope.go:117] "RemoveContainer" containerID="d9d38f858e6edc203dc8df1bf5835ed813e32cef59fc673493c9bd967a7663f5"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.494802 4749 scope.go:117] "RemoveContainer" containerID="6ce9e7be68ee2c73ea58411c5582a5214befdbaa9b34e3fc190b6c493b5cc976"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.526718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-log-httpd\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.526817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-run-httpd\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.527049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.527139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-config-data\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.527240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-scripts\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.527309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.527375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkms\" (UniqueName: \"kubernetes.io/projected/b7838589-f49f-4b20-a019-48db9b2ae719-kube-api-access-cgkms\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.528345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-log-httpd\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.533567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-run-httpd\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.552133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.552253 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.552299 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.552315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.552338 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.552697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-scripts\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.569082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-config-data\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.569729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.580243 4749 scope.go:117] "RemoveContainer" containerID="cecf2d389ccc361882dad3f9a6bf6528e23aa044d6a9d743e749d18ef74520d2"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.581089 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkms\" (UniqueName: \"kubernetes.io/projected/b7838589-f49f-4b20-a019-48db9b2ae719-kube-api-access-cgkms\") pod \"ceilometer-0\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") " pod="openstack/ceilometer-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.590555 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.631295 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.639595 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.646566 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4t4t8"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.647476 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.680102 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.685564 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.737082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.737146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e78771f8-d9d8-4ca8-9295-dda27179f45c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.737206 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.737235 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.737288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcz94\" (UniqueName: \"kubernetes.io/projected/e78771f8-d9d8-4ca8-9295-dda27179f45c-kube-api-access-qcz94\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.737415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.766599 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.771093 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-687898f47c-94xfz"] Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.771337 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-687898f47c-94xfz" podUID="463b2048-d918-4de3-834d-b6899813a604" containerName="dnsmasq-dns" containerID="cri-o://559448d79682b63b3e3f5289796e09dc6c05b11dae7e863c5ac2cdb3e5792652" gracePeriod=10 Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.859215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.859327 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e78771f8-d9d8-4ca8-9295-dda27179f45c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.859457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.859492 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.859538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcz94\" (UniqueName: \"kubernetes.io/projected/e78771f8-d9d8-4ca8-9295-dda27179f45c-kube-api-access-qcz94\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.859600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.864110 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7576c49c-zd5s9"] Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.866101 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.867095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e78771f8-d9d8-4ca8-9295-dda27179f45c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.874754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.879975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.882137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.886868 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.907658 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qcz94\" (UniqueName: \"kubernetes.io/projected/e78771f8-d9d8-4ca8-9295-dda27179f45c-kube-api-access-qcz94\") pod \"cinder-scheduler-0\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") " pod="openstack/cinder-scheduler-0" Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.913075 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7576c49c-zd5s9"] Feb 19 18:52:43 crc kubenswrapper[4749]: I0219 18:52:43.984161 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.008498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.012492 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.044327 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.059199 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.109039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.109091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-svc\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " 
pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.109109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.109215 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88cwh\" (UniqueName: \"kubernetes.io/projected/fc4a9e03-efdd-49ff-b106-8eb68a971e78-kube-api-access-88cwh\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.109248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-config\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.109303 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: E0219 18:52:44.210165 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod463b2048_d918_4de3_834d_b6899813a604.slice/crio-conmon-559448d79682b63b3e3f5289796e09dc6c05b11dae7e863c5ac2cdb3e5792652.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod463b2048_d918_4de3_834d_b6899813a604.slice/crio-559448d79682b63b3e3f5289796e09dc6c05b11dae7e863c5ac2cdb3e5792652.scope\": RecentStats: unable to find data in memory cache]" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210326 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lltmb\" (UniqueName: \"kubernetes.io/projected/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-kube-api-access-lltmb\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210390 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-scripts\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " 
pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-svc\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88cwh\" (UniqueName: \"kubernetes.io/projected/fc4a9e03-efdd-49ff-b106-8eb68a971e78-kube-api-access-88cwh\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc 
kubenswrapper[4749]: I0219 18:52:44.210607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-logs\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-config\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.210674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.211537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.212065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.212756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-svc\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.213551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-config\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.213887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.258646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88cwh\" (UniqueName: \"kubernetes.io/projected/fc4a9e03-efdd-49ff-b106-8eb68a971e78-kube-api-access-88cwh\") pod \"dnsmasq-dns-5c7576c49c-zd5s9\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.312402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.312768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.312823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-logs\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.312876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.312904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.312970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lltmb\" (UniqueName: \"kubernetes.io/projected/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-kube-api-access-lltmb\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: 
I0219 18:52:44.313149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-scripts\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.313545 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.314430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-logs\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.317090 4749 generic.go:334] "Generic (PLEG): container finished" podID="463b2048-d918-4de3-834d-b6899813a604" containerID="559448d79682b63b3e3f5289796e09dc6c05b11dae7e863c5ac2cdb3e5792652" exitCode=0 Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.317173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687898f47c-94xfz" event={"ID":"463b2048-d918-4de3-834d-b6899813a604","Type":"ContainerDied","Data":"559448d79682b63b3e3f5289796e09dc6c05b11dae7e863c5ac2cdb3e5792652"} Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.317401 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-scripts\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.319258 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.320340 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.323062 4749 scope.go:117] "RemoveContainer" containerID="bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd" Feb 19 18:52:44 crc kubenswrapper[4749]: E0219 18:52:44.323412 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(763173db-176a-426d-bd85-e051d56ec5cf)\"" pod="openstack/watcher-decision-engine-0" podUID="763173db-176a-426d-bd85-e051d56ec5cf" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.323618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.336614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lltmb\" (UniqueName: \"kubernetes.io/projected/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-kube-api-access-lltmb\") pod \"cinder-api-0\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.364657 4749 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.413758 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.448302 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.594865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:52:44 crc kubenswrapper[4749]: W0219 18:52:44.595569 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7838589_f49f_4b20_a019_48db9b2ae719.slice/crio-42f7884fb0c8432df808b91db0331a986e3c47ed84d9ff236ea4e2105527090f WatchSource:0}: Error finding container 42f7884fb0c8432df808b91db0331a986e3c47ed84d9ff236ea4e2105527090f: Status 404 returned error can't find the container with id 42f7884fb0c8432df808b91db0331a986e3c47ed84d9ff236ea4e2105527090f Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.640590 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.689474 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d13d51-29b5-463f-873d-4a586878e0c4" path="/var/lib/kubelet/pods/66d13d51-29b5-463f-873d-4a586878e0c4/volumes" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.695387 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.820848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-nb\") pod \"463b2048-d918-4de3-834d-b6899813a604\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.821683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcz79\" (UniqueName: \"kubernetes.io/projected/463b2048-d918-4de3-834d-b6899813a604-kube-api-access-hcz79\") pod \"463b2048-d918-4de3-834d-b6899813a604\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.821761 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-sb\") pod \"463b2048-d918-4de3-834d-b6899813a604\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.822007 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-swift-storage-0\") pod \"463b2048-d918-4de3-834d-b6899813a604\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.822095 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-config\") pod \"463b2048-d918-4de3-834d-b6899813a604\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.822136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-svc\") pod \"463b2048-d918-4de3-834d-b6899813a604\" (UID: \"463b2048-d918-4de3-834d-b6899813a604\") " Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.825484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463b2048-d918-4de3-834d-b6899813a604-kube-api-access-hcz79" (OuterVolumeSpecName: "kube-api-access-hcz79") pod "463b2048-d918-4de3-834d-b6899813a604" (UID: "463b2048-d918-4de3-834d-b6899813a604"). InnerVolumeSpecName "kube-api-access-hcz79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.891550 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 18:52:44 crc kubenswrapper[4749]: W0219 18:52:44.902076 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode78771f8_d9d8_4ca8_9295_dda27179f45c.slice/crio-81fdcafe37332ff6aba335e95290b2db8d8b7caf1b119cacee6a45ab0f792976 WatchSource:0}: Error finding container 81fdcafe37332ff6aba335e95290b2db8d8b7caf1b119cacee6a45ab0f792976: Status 404 returned error can't find the container with id 81fdcafe37332ff6aba335e95290b2db8d8b7caf1b119cacee6a45ab0f792976 Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.925200 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcz79\" (UniqueName: \"kubernetes.io/projected/463b2048-d918-4de3-834d-b6899813a604-kube-api-access-hcz79\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.958645 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "463b2048-d918-4de3-834d-b6899813a604" (UID: "463b2048-d918-4de3-834d-b6899813a604"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.971215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-config" (OuterVolumeSpecName: "config") pod "463b2048-d918-4de3-834d-b6899813a604" (UID: "463b2048-d918-4de3-834d-b6899813a604"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.980238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "463b2048-d918-4de3-834d-b6899813a604" (UID: "463b2048-d918-4de3-834d-b6899813a604"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.980328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "463b2048-d918-4de3-834d-b6899813a604" (UID: "463b2048-d918-4de3-834d-b6899813a604"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:44 crc kubenswrapper[4749]: I0219 18:52:44.983312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "463b2048-d918-4de3-834d-b6899813a604" (UID: "463b2048-d918-4de3-834d-b6899813a604"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.021293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7576c49c-zd5s9"] Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.029491 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.029520 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.029533 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.029542 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.029551 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/463b2048-d918-4de3-834d-b6899813a604-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.102560 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.355304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e78771f8-d9d8-4ca8-9295-dda27179f45c","Type":"ContainerStarted","Data":"81fdcafe37332ff6aba335e95290b2db8d8b7caf1b119cacee6a45ab0f792976"} Feb 19 
18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.361465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerStarted","Data":"f19d5a7c0676700b8bb0a7f9196177a390c33f2bf5755b61c48ef371da44afe0"} Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.361505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerStarted","Data":"42f7884fb0c8432df808b91db0331a986e3c47ed84d9ff236ea4e2105527090f"} Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.380436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687898f47c-94xfz" event={"ID":"463b2048-d918-4de3-834d-b6899813a604","Type":"ContainerDied","Data":"013b5b22db710aaa5152174cfb1c99cedf56c97f065975447ab7f94d49e26da8"} Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.380496 4749 scope.go:117] "RemoveContainer" containerID="559448d79682b63b3e3f5289796e09dc6c05b11dae7e863c5ac2cdb3e5792652" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.380607 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-687898f47c-94xfz" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.385494 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5f8f37a-aa8a-4408-87c2-3a820d6909fd","Type":"ContainerStarted","Data":"d961ef8fd852cf8e66225045a9cc14cf447f5824afd1339fe657504fbc696078"} Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.388583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" event={"ID":"fc4a9e03-efdd-49ff-b106-8eb68a971e78","Type":"ContainerStarted","Data":"ca273837daf4b2553c7bbff285efb4b148bca0fd3e1480156447a15ec89a8f8b"} Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.389180 4749 scope.go:117] "RemoveContainer" containerID="bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd" Feb 19 18:52:45 crc kubenswrapper[4749]: E0219 18:52:45.389439 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(763173db-176a-426d-bd85-e051d56ec5cf)\"" pod="openstack/watcher-decision-engine-0" podUID="763173db-176a-426d-bd85-e051d56ec5cf" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.451795 4749 scope.go:117] "RemoveContainer" containerID="da0e8cb375b3aaa51c4f256052df56da7007a9e5503dc1ef1a0a03fdf04aefa0" Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.500241 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-687898f47c-94xfz"] Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.509187 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-687898f47c-94xfz"] Feb 19 18:52:45 crc kubenswrapper[4749]: I0219 18:52:45.959460 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:46 crc 
kubenswrapper[4749]: I0219 18:52:46.445141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5f8f37a-aa8a-4408-87c2-3a820d6909fd","Type":"ContainerStarted","Data":"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4"} Feb 19 18:52:46 crc kubenswrapper[4749]: I0219 18:52:46.453833 4749 generic.go:334] "Generic (PLEG): container finished" podID="5701fcc2-ae2a-4017-8991-3470421ff234" containerID="fd16a22fd1e877553f10b8e81bace93a1789b59cfab3e6a9937e8cdb3cbf4092" exitCode=0 Feb 19 18:52:46 crc kubenswrapper[4749]: I0219 18:52:46.453902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cmbgb" event={"ID":"5701fcc2-ae2a-4017-8991-3470421ff234","Type":"ContainerDied","Data":"fd16a22fd1e877553f10b8e81bace93a1789b59cfab3e6a9937e8cdb3cbf4092"} Feb 19 18:52:46 crc kubenswrapper[4749]: I0219 18:52:46.469251 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" containerID="c3e0067df74bedc91b407f6ad00c88d8df83995229796215add530418da3086a" exitCode=0 Feb 19 18:52:46 crc kubenswrapper[4749]: I0219 18:52:46.469329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" event={"ID":"fc4a9e03-efdd-49ff-b106-8eb68a971e78","Type":"ContainerDied","Data":"c3e0067df74bedc91b407f6ad00c88d8df83995229796215add530418da3086a"} Feb 19 18:52:46 crc kubenswrapper[4749]: I0219 18:52:46.479034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e78771f8-d9d8-4ca8-9295-dda27179f45c","Type":"ContainerStarted","Data":"81559378aa5cae39127229a527270851fc8cc05fb6eb870d68a51ac2e36acba8"} Feb 19 18:52:46 crc kubenswrapper[4749]: I0219 18:52:46.490113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerStarted","Data":"f8643b66de6291bf89ae58c0eba3442fc66922713ac6fb99b8e06f8e3cea4fcb"} Feb 19 18:52:46 crc kubenswrapper[4749]: I0219 18:52:46.705991 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463b2048-d918-4de3-834d-b6899813a604" path="/var/lib/kubelet/pods/463b2048-d918-4de3-834d-b6899813a604/volumes" Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.512818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerStarted","Data":"bbc77b191ed97fab0dcdefa3a50b1f37a01e3e78e7bfdf81eca25489fd4165d4"} Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.514390 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5f8f37a-aa8a-4408-87c2-3a820d6909fd","Type":"ContainerStarted","Data":"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e"} Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.514498 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api-log" containerID="cri-o://8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4" gracePeriod=30 Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.514529 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.514557 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api" containerID="cri-o://28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e" gracePeriod=30 Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.515895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" event={"ID":"fc4a9e03-efdd-49ff-b106-8eb68a971e78","Type":"ContainerStarted","Data":"6a59099c8e518b01c51fd287ff6818ccaae6707b1565035a93a141d072e44b16"} Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.516095 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.519201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e78771f8-d9d8-4ca8-9295-dda27179f45c","Type":"ContainerStarted","Data":"c217810bc994ed64df8a5fb18536c7b49979ce42ac393e0fc0530f121332c7a1"} Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.549379 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.549355733 podStartE2EDuration="4.549355733s" podCreationTimestamp="2026-02-19 18:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:47.537830752 +0000 UTC m=+1141.499050706" watchObservedRunningTime="2026-02-19 18:52:47.549355733 +0000 UTC m=+1141.510575687" Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.560779 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.226420198 podStartE2EDuration="4.560763641s" podCreationTimestamp="2026-02-19 18:52:43 +0000 UTC" firstStartedPulling="2026-02-19 18:52:44.905891234 +0000 UTC m=+1138.867111188" lastFinishedPulling="2026-02-19 18:52:45.240234677 +0000 UTC m=+1139.201454631" observedRunningTime="2026-02-19 18:52:47.557988424 +0000 UTC m=+1141.519208388" watchObservedRunningTime="2026-02-19 18:52:47.560763641 +0000 UTC m=+1141.521983595" Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.585193 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" podStartSLOduration=4.585177023 podStartE2EDuration="4.585177023s" podCreationTimestamp="2026-02-19 18:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:47.580706138 +0000 UTC m=+1141.541926092" watchObservedRunningTime="2026-02-19 18:52:47.585177023 +0000 UTC m=+1141.546396977" Feb 19 18:52:47 crc kubenswrapper[4749]: I0219 18:52:47.982413 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.018448 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-combined-ca-bundle\") pod \"5701fcc2-ae2a-4017-8991-3470421ff234\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.018549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-db-sync-config-data\") pod \"5701fcc2-ae2a-4017-8991-3470421ff234\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.018694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvv7t\" (UniqueName: \"kubernetes.io/projected/5701fcc2-ae2a-4017-8991-3470421ff234-kube-api-access-vvv7t\") pod \"5701fcc2-ae2a-4017-8991-3470421ff234\" (UID: \"5701fcc2-ae2a-4017-8991-3470421ff234\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.024528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod 
"5701fcc2-ae2a-4017-8991-3470421ff234" (UID: "5701fcc2-ae2a-4017-8991-3470421ff234"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.025114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5701fcc2-ae2a-4017-8991-3470421ff234-kube-api-access-vvv7t" (OuterVolumeSpecName: "kube-api-access-vvv7t") pod "5701fcc2-ae2a-4017-8991-3470421ff234" (UID: "5701fcc2-ae2a-4017-8991-3470421ff234"). InnerVolumeSpecName "kube-api-access-vvv7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.057084 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5701fcc2-ae2a-4017-8991-3470421ff234" (UID: "5701fcc2-ae2a-4017-8991-3470421ff234"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.093379 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.134688 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-combined-ca-bundle\") pod \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.135097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lltmb\" (UniqueName: \"kubernetes.io/projected/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-kube-api-access-lltmb\") pod \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.135131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data-custom\") pod \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.135188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-logs\") pod \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.135240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data\") pod \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.135290 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-etc-machine-id\") pod \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.135415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-scripts\") pod \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\" (UID: \"b5f8f37a-aa8a-4408-87c2-3a820d6909fd\") " Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.136087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b5f8f37a-aa8a-4408-87c2-3a820d6909fd" (UID: "b5f8f37a-aa8a-4408-87c2-3a820d6909fd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.136350 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.136371 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5701fcc2-ae2a-4017-8991-3470421ff234-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.136384 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvv7t\" (UniqueName: \"kubernetes.io/projected/5701fcc2-ae2a-4017-8991-3470421ff234-kube-api-access-vvv7t\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.136800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-logs" 
(OuterVolumeSpecName: "logs") pod "b5f8f37a-aa8a-4408-87c2-3a820d6909fd" (UID: "b5f8f37a-aa8a-4408-87c2-3a820d6909fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.141463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-kube-api-access-lltmb" (OuterVolumeSpecName: "kube-api-access-lltmb") pod "b5f8f37a-aa8a-4408-87c2-3a820d6909fd" (UID: "b5f8f37a-aa8a-4408-87c2-3a820d6909fd"). InnerVolumeSpecName "kube-api-access-lltmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.155805 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b5f8f37a-aa8a-4408-87c2-3a820d6909fd" (UID: "b5f8f37a-aa8a-4408-87c2-3a820d6909fd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.156673 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-scripts" (OuterVolumeSpecName: "scripts") pod "b5f8f37a-aa8a-4408-87c2-3a820d6909fd" (UID: "b5f8f37a-aa8a-4408-87c2-3a820d6909fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.175479 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5f8f37a-aa8a-4408-87c2-3a820d6909fd" (UID: "b5f8f37a-aa8a-4408-87c2-3a820d6909fd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.208414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data" (OuterVolumeSpecName: "config-data") pod "b5f8f37a-aa8a-4408-87c2-3a820d6909fd" (UID: "b5f8f37a-aa8a-4408-87c2-3a820d6909fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.237734 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.237780 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.237792 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lltmb\" (UniqueName: \"kubernetes.io/projected/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-kube-api-access-lltmb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.237803 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.237811 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.237819 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.237827 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5f8f37a-aa8a-4408-87c2-3a820d6909fd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.542065 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cmbgb" event={"ID":"5701fcc2-ae2a-4017-8991-3470421ff234","Type":"ContainerDied","Data":"37a318854400c75cd258f5bde67fc9ea170b4c0c935cec65eda508fb2ec8e8bc"} Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.542104 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a318854400c75cd258f5bde67fc9ea170b4c0c935cec65eda508fb2ec8e8bc" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.542181 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cmbgb" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.547432 4749 generic.go:334] "Generic (PLEG): container finished" podID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerID="28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e" exitCode=0 Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.547456 4749 generic.go:334] "Generic (PLEG): container finished" podID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerID="8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4" exitCode=143 Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.547484 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.547541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5f8f37a-aa8a-4408-87c2-3a820d6909fd","Type":"ContainerDied","Data":"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e"} Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.547565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5f8f37a-aa8a-4408-87c2-3a820d6909fd","Type":"ContainerDied","Data":"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4"} Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.547576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5f8f37a-aa8a-4408-87c2-3a820d6909fd","Type":"ContainerDied","Data":"d961ef8fd852cf8e66225045a9cc14cf447f5824afd1339fe657504fbc696078"} Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.547589 4749 scope.go:117] "RemoveContainer" containerID="28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.589870 4749 scope.go:117] "RemoveContainer" containerID="8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.617180 4749 scope.go:117] "RemoveContainer" containerID="28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e" Feb 19 18:52:48 crc kubenswrapper[4749]: E0219 18:52:48.617697 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e\": container with ID starting with 28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e not found: ID does not exist" containerID="28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 
18:52:48.617735 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e"} err="failed to get container status \"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e\": rpc error: code = NotFound desc = could not find container \"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e\": container with ID starting with 28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e not found: ID does not exist" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.617763 4749 scope.go:117] "RemoveContainer" containerID="8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4" Feb 19 18:52:48 crc kubenswrapper[4749]: E0219 18:52:48.619537 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4\": container with ID starting with 8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4 not found: ID does not exist" containerID="8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.619594 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4"} err="failed to get container status \"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4\": rpc error: code = NotFound desc = could not find container \"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4\": container with ID starting with 8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4 not found: ID does not exist" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.619616 4749 scope.go:117] "RemoveContainer" containerID="28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e" Feb 19 18:52:48 crc 
kubenswrapper[4749]: I0219 18:52:48.620909 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e"} err="failed to get container status \"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e\": rpc error: code = NotFound desc = could not find container \"28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e\": container with ID starting with 28dda44d4a2934dc868365570fe87bb612c38fa232beb99f9403169c67392f6e not found: ID does not exist" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.620940 4749 scope.go:117] "RemoveContainer" containerID="8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.621730 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4"} err="failed to get container status \"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4\": rpc error: code = NotFound desc = could not find container \"8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4\": container with ID starting with 8e6d5df6c451429f0da23b70de820a3495c134ca677021837d73c6a4f27d69e4 not found: ID does not exist" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.641693 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.656047 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.671744 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:48 crc kubenswrapper[4749]: E0219 18:52:48.672265 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5701fcc2-ae2a-4017-8991-3470421ff234" containerName="barbican-db-sync" Feb 19 
18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672284 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5701fcc2-ae2a-4017-8991-3470421ff234" containerName="barbican-db-sync" Feb 19 18:52:48 crc kubenswrapper[4749]: E0219 18:52:48.672304 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463b2048-d918-4de3-834d-b6899813a604" containerName="init" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672310 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="463b2048-d918-4de3-834d-b6899813a604" containerName="init" Feb 19 18:52:48 crc kubenswrapper[4749]: E0219 18:52:48.672335 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api-log" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672344 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api-log" Feb 19 18:52:48 crc kubenswrapper[4749]: E0219 18:52:48.672359 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672364 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api" Feb 19 18:52:48 crc kubenswrapper[4749]: E0219 18:52:48.672372 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463b2048-d918-4de3-834d-b6899813a604" containerName="dnsmasq-dns" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672377 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="463b2048-d918-4de3-834d-b6899813a604" containerName="dnsmasq-dns" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672540 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="463b2048-d918-4de3-834d-b6899813a604" containerName="dnsmasq-dns" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672552 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5701fcc2-ae2a-4017-8991-3470421ff234" containerName="barbican-db-sync" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672562 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.672572 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" containerName="cinder-api-log" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.673560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.676368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.680285 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.684829 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.728204 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f8f37a-aa8a-4408-87c2-3a820d6909fd" path="/var/lib/kubelet/pods/b5f8f37a-aa8a-4408-87c2-3a820d6909fd/volumes" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.732109 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.751910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 
crc kubenswrapper[4749]: I0219 18:52:48.751950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-config-data-custom\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.751984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8559a3-5137-4d82-a189-18e060db5fa5-logs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.752061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-scripts\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.752075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-config-data\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.752107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.752151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa8559a3-5137-4d82-a189-18e060db5fa5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.752170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.752202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sv8m\" (UniqueName: \"kubernetes.io/projected/fa8559a3-5137-4d82-a189-18e060db5fa5-kube-api-access-6sv8m\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.776178 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b845bddc9-bzwtz"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.778435 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.785137 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dxh5g" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.785335 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.785591 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.803058 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b845bddc9-bzwtz"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.819000 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d44554668-c49q8"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.820700 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.823033 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.835161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d44554668-c49q8"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.854832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sv8m\" (UniqueName: \"kubernetes.io/projected/fa8559a3-5137-4d82-a189-18e060db5fa5-kube-api-access-6sv8m\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.854904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-combined-ca-bundle\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.854944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stbrk\" (UniqueName: \"kubernetes.io/projected/16aa5e20-01b7-401e-abfd-161e81af9c70-kube-api-access-stbrk\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.854985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " 
pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855014 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-config-data-custom\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855187 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8559a3-5137-4d82-a189-18e060db5fa5-logs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea2def9-7751-439c-8c18-05f3568cae9f-logs\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855303 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-config-data\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855353 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-config-data-custom\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 
19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-combined-ca-bundle\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7fhg\" (UniqueName: \"kubernetes.io/projected/2ea2def9-7751-439c-8c18-05f3568cae9f-kube-api-access-w7fhg\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-config-data\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-scripts\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-config-data\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 
18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-config-data-custom\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aa5e20-01b7-401e-abfd-161e81af9c70-logs\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8559a3-5137-4d82-a189-18e060db5fa5-logs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa8559a3-5137-4d82-a189-18e060db5fa5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.855710 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.856608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa8559a3-5137-4d82-a189-18e060db5fa5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.862615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-config-data-custom\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.863113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.865548 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-scripts\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.868865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " 
pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.870694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-config-data\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.874474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8559a3-5137-4d82-a189-18e060db5fa5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.875474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sv8m\" (UniqueName: \"kubernetes.io/projected/fa8559a3-5137-4d82-a189-18e060db5fa5-kube-api-access-6sv8m\") pod \"cinder-api-0\" (UID: \"fa8559a3-5137-4d82-a189-18e060db5fa5\") " pod="openstack/cinder-api-0" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.915197 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7576c49c-zd5s9"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.930360 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b767c57c-tn7gd"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.931911 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.946898 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b767c57c-tn7gd"] Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-combined-ca-bundle\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7fhg\" (UniqueName: \"kubernetes.io/projected/2ea2def9-7751-439c-8c18-05f3568cae9f-kube-api-access-w7fhg\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-config-data\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-config-data-custom\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957227 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-svc\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957245 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aa5e20-01b7-401e-abfd-161e81af9c70-logs\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-nb\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-sb\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957352 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-combined-ca-bundle\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957373 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmdzt\" (UniqueName: \"kubernetes.io/projected/ca3e53a8-410a-4b02-95e8-00a98704a7ff-kube-api-access-kmdzt\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stbrk\" (UniqueName: \"kubernetes.io/projected/16aa5e20-01b7-401e-abfd-161e81af9c70-kube-api-access-stbrk\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-config\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-swift-storage-0\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea2def9-7751-439c-8c18-05f3568cae9f-logs\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 
18:52:48.957508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-config-data\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.957532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-config-data-custom\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.959492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aa5e20-01b7-401e-abfd-161e81af9c70-logs\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.960065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ea2def9-7751-439c-8c18-05f3568cae9f-logs\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.960965 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-combined-ca-bundle\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 
18:52:48.961248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-config-data-custom\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.962147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-config-data-custom\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.963725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-combined-ca-bundle\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.970486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aa5e20-01b7-401e-abfd-161e81af9c70-config-data\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.971060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea2def9-7751-439c-8c18-05f3568cae9f-config-data\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 
18:52:48.989651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7fhg\" (UniqueName: \"kubernetes.io/projected/2ea2def9-7751-439c-8c18-05f3568cae9f-kube-api-access-w7fhg\") pod \"barbican-keystone-listener-7d44554668-c49q8\" (UID: \"2ea2def9-7751-439c-8c18-05f3568cae9f\") " pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.997624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stbrk\" (UniqueName: \"kubernetes.io/projected/16aa5e20-01b7-401e-abfd-161e81af9c70-kube-api-access-stbrk\") pod \"barbican-worker-7b845bddc9-bzwtz\" (UID: \"16aa5e20-01b7-401e-abfd-161e81af9c70\") " pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:48 crc kubenswrapper[4749]: I0219 18:52:48.999465 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.049128 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.071074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-svc\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.071145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-nb\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.071200 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-sb\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.071221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmdzt\" (UniqueName: \"kubernetes.io/projected/ca3e53a8-410a-4b02-95e8-00a98704a7ff-kube-api-access-kmdzt\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.071247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-config\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.071274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-swift-storage-0\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.072147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-sb\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.072549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-svc\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.072684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-nb\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.072850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-swift-storage-0\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.072887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-config\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.086802 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-dc7975d7b-jl97p"] Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.092488 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.097121 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.105804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dc7975d7b-jl97p"] Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.114864 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b845bddc9-bzwtz" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.133240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmdzt\" (UniqueName: \"kubernetes.io/projected/ca3e53a8-410a-4b02-95e8-00a98704a7ff-kube-api-access-kmdzt\") pod \"dnsmasq-dns-79b767c57c-tn7gd\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") " pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.155515 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d44554668-c49q8" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.276791 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.286170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data-custom\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.286232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.286257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee13f81f-9ab7-496e-b4e6-85a530c1774e-logs\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.286354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zjfw\" (UniqueName: \"kubernetes.io/projected/ee13f81f-9ab7-496e-b4e6-85a530c1774e-kube-api-access-9zjfw\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.286376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-combined-ca-bundle\") pod 
\"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.389111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data-custom\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.389174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.389197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee13f81f-9ab7-496e-b4e6-85a530c1774e-logs\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.389301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zjfw\" (UniqueName: \"kubernetes.io/projected/ee13f81f-9ab7-496e-b4e6-85a530c1774e-kube-api-access-9zjfw\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.389318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-combined-ca-bundle\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: 
\"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.390365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee13f81f-9ab7-496e-b4e6-85a530c1774e-logs\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.398657 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-combined-ca-bundle\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.398844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.403606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data-custom\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.409792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zjfw\" (UniqueName: \"kubernetes.io/projected/ee13f81f-9ab7-496e-b4e6-85a530c1774e-kube-api-access-9zjfw\") pod \"barbican-api-dc7975d7b-jl97p\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 
18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.562085 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" podUID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" containerName="dnsmasq-dns" containerID="cri-o://6a59099c8e518b01c51fd287ff6818ccaae6707b1565035a93a141d072e44b16" gracePeriod=10 Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.563143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerStarted","Data":"a16f4e4d1acac26c195aeb2c3e2a960c0c0cedff3af620794bb6b1f4acec01c6"} Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.563481 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.592141 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.713743464 podStartE2EDuration="6.592124014s" podCreationTimestamp="2026-02-19 18:52:43 +0000 UTC" firstStartedPulling="2026-02-19 18:52:44.60268455 +0000 UTC m=+1138.563904504" lastFinishedPulling="2026-02-19 18:52:48.48106511 +0000 UTC m=+1142.442285054" observedRunningTime="2026-02-19 18:52:49.583014399 +0000 UTC m=+1143.544234363" watchObservedRunningTime="2026-02-19 18:52:49.592124014 +0000 UTC m=+1143.553343958" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.602701 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.696632 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.710785 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.714406 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.818149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b845bddc9-bzwtz"] Feb 19 18:52:49 crc kubenswrapper[4749]: W0219 18:52:49.818362 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ea2def9_7751_439c_8c18_05f3568cae9f.slice/crio-a8dc5bd3683d2c3fc7afedcfb0458669e03b247c1c66fe6ae1705a37ca253e6e WatchSource:0}: Error finding container a8dc5bd3683d2c3fc7afedcfb0458669e03b247c1c66fe6ae1705a37ca253e6e: Status 404 returned error can't find the container with id a8dc5bd3683d2c3fc7afedcfb0458669e03b247c1c66fe6ae1705a37ca253e6e Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.833197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d44554668-c49q8"] Feb 19 18:52:49 crc kubenswrapper[4749]: I0219 18:52:49.934470 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b767c57c-tn7gd"] Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.131621 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dc7975d7b-jl97p"] Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.585951 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" 
containerID="6a59099c8e518b01c51fd287ff6818ccaae6707b1565035a93a141d072e44b16" exitCode=0 Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.586007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" event={"ID":"fc4a9e03-efdd-49ff-b106-8eb68a971e78","Type":"ContainerDied","Data":"6a59099c8e518b01c51fd287ff6818ccaae6707b1565035a93a141d072e44b16"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.587762 4749 generic.go:334] "Generic (PLEG): container finished" podID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerID="6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302" exitCode=0 Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.587804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" event={"ID":"ca3e53a8-410a-4b02-95e8-00a98704a7ff","Type":"ContainerDied","Data":"6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.587820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" event={"ID":"ca3e53a8-410a-4b02-95e8-00a98704a7ff","Type":"ContainerStarted","Data":"a21fb7fcde68258d7052ea52a31d36f2967e4c5e2d9e0707b7e0a6a0e66d9306"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.601152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc7975d7b-jl97p" event={"ID":"ee13f81f-9ab7-496e-b4e6-85a530c1774e","Type":"ContainerStarted","Data":"0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.601188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc7975d7b-jl97p" event={"ID":"ee13f81f-9ab7-496e-b4e6-85a530c1774e","Type":"ContainerStarted","Data":"95ac553ea931a44055411e3d8c81b751e89748515d1db968bd47c76f3f2cd0ee"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.603790 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa8559a3-5137-4d82-a189-18e060db5fa5","Type":"ContainerStarted","Data":"3025b090602a33c12d6625f497662bfcff17357bcd67508c7cef84e3b154d0f9"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.604102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa8559a3-5137-4d82-a189-18e060db5fa5","Type":"ContainerStarted","Data":"a2e6bc258e32c3fcd21bb3618da3b1d0873cfe39eacaa2cc549d973da9fe8da8"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.606288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d44554668-c49q8" event={"ID":"2ea2def9-7751-439c-8c18-05f3568cae9f","Type":"ContainerStarted","Data":"a8dc5bd3683d2c3fc7afedcfb0458669e03b247c1c66fe6ae1705a37ca253e6e"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.612711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b845bddc9-bzwtz" event={"ID":"16aa5e20-01b7-401e-abfd-161e81af9c70","Type":"ContainerStarted","Data":"373c976c510c766198217232ab4707613503e427bb83c63c355f57e5b5e9730a"} Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.625965 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.717743 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.835731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-swift-storage-0\") pod \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.835877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-config\") pod \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.835902 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88cwh\" (UniqueName: \"kubernetes.io/projected/fc4a9e03-efdd-49ff-b106-8eb68a971e78-kube-api-access-88cwh\") pod \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.835953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-nb\") pod \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.835970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-sb\") pod \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.836082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-svc\") pod \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\" (UID: \"fc4a9e03-efdd-49ff-b106-8eb68a971e78\") " Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.840836 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4a9e03-efdd-49ff-b106-8eb68a971e78-kube-api-access-88cwh" (OuterVolumeSpecName: "kube-api-access-88cwh") pod "fc4a9e03-efdd-49ff-b106-8eb68a971e78" (UID: "fc4a9e03-efdd-49ff-b106-8eb68a971e78"). InnerVolumeSpecName "kube-api-access-88cwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.937825 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88cwh\" (UniqueName: \"kubernetes.io/projected/fc4a9e03-efdd-49ff-b106-8eb68a971e78-kube-api-access-88cwh\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.952142 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-config" (OuterVolumeSpecName: "config") pod "fc4a9e03-efdd-49ff-b106-8eb68a971e78" (UID: "fc4a9e03-efdd-49ff-b106-8eb68a971e78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.957258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc4a9e03-efdd-49ff-b106-8eb68a971e78" (UID: "fc4a9e03-efdd-49ff-b106-8eb68a971e78"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.965041 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc4a9e03-efdd-49ff-b106-8eb68a971e78" (UID: "fc4a9e03-efdd-49ff-b106-8eb68a971e78"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.984555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc4a9e03-efdd-49ff-b106-8eb68a971e78" (UID: "fc4a9e03-efdd-49ff-b106-8eb68a971e78"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:50 crc kubenswrapper[4749]: I0219 18:52:50.989996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc4a9e03-efdd-49ff-b106-8eb68a971e78" (UID: "fc4a9e03-efdd-49ff-b106-8eb68a971e78"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.039730 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.039777 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.039790 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.039803 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.039814 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc4a9e03-efdd-49ff-b106-8eb68a971e78-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.624235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" event={"ID":"ca3e53a8-410a-4b02-95e8-00a98704a7ff","Type":"ContainerStarted","Data":"e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9"} Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.624362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.628802 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-dc7975d7b-jl97p" event={"ID":"ee13f81f-9ab7-496e-b4e6-85a530c1774e","Type":"ContainerStarted","Data":"92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04"} Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.629462 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.629499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.632209 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.632474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7576c49c-zd5s9" event={"ID":"fc4a9e03-efdd-49ff-b106-8eb68a971e78","Type":"ContainerDied","Data":"ca273837daf4b2553c7bbff285efb4b148bca0fd3e1480156447a15ec89a8f8b"} Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.632610 4749 scope.go:117] "RemoveContainer" containerID="6a59099c8e518b01c51fd287ff6818ccaae6707b1565035a93a141d072e44b16" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.644574 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" podStartSLOduration=3.644550725 podStartE2EDuration="3.644550725s" podCreationTimestamp="2026-02-19 18:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:51.639978108 +0000 UTC m=+1145.601198072" watchObservedRunningTime="2026-02-19 18:52:51.644550725 +0000 UTC m=+1145.605770679" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.680124 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-dc7975d7b-jl97p" 
podStartSLOduration=2.680105667 podStartE2EDuration="2.680105667s" podCreationTimestamp="2026-02-19 18:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:51.667776513 +0000 UTC m=+1145.628996487" watchObservedRunningTime="2026-02-19 18:52:51.680105667 +0000 UTC m=+1145.641325611" Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.690422 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7576c49c-zd5s9"] Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.698865 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7576c49c-zd5s9"] Feb 19 18:52:51 crc kubenswrapper[4749]: I0219 18:52:51.884182 4749 scope.go:117] "RemoveContainer" containerID="c3e0067df74bedc91b407f6ad00c88d8df83995229796215add530418da3086a" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.066784 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d97b8448-pl5l5"] Feb 19 18:52:52 crc kubenswrapper[4749]: E0219 18:52:52.067203 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" containerName="init" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.067221 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" containerName="init" Feb 19 18:52:52 crc kubenswrapper[4749]: E0219 18:52:52.067261 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" containerName="dnsmasq-dns" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.067272 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" containerName="dnsmasq-dns" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.067482 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" 
containerName="dnsmasq-dns" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.071981 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.076704 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.078501 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.104161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d97b8448-pl5l5"] Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.164897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19ae41b-3804-434e-b4a6-d461167f9548-logs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.165209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-public-tls-certs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.165243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-config-data\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.165264 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls42d\" (UniqueName: \"kubernetes.io/projected/f19ae41b-3804-434e-b4a6-d461167f9548-kube-api-access-ls42d\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.165431 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-config-data-custom\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.165487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-combined-ca-bundle\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.165519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-internal-tls-certs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.269659 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-config-data-custom\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 
18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.269998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-combined-ca-bundle\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.270038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-internal-tls-certs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.270077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19ae41b-3804-434e-b4a6-d461167f9548-logs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.270199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-public-tls-certs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.270227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-config-data\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.270249 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls42d\" (UniqueName: \"kubernetes.io/projected/f19ae41b-3804-434e-b4a6-d461167f9548-kube-api-access-ls42d\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.270638 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19ae41b-3804-434e-b4a6-d461167f9548-logs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.275971 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-config-data-custom\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.276395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-public-tls-certs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.278245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-combined-ca-bundle\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.278436 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-internal-tls-certs\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.279419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ae41b-3804-434e-b4a6-d461167f9548-config-data\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.289406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls42d\" (UniqueName: \"kubernetes.io/projected/f19ae41b-3804-434e-b4a6-d461167f9548-kube-api-access-ls42d\") pod \"barbican-api-6d97b8448-pl5l5\" (UID: \"f19ae41b-3804-434e-b4a6-d461167f9548\") " pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.346202 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-866f5f4f4b-zfvsn" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.166:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.166:8443: connect: connection refused"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.413162 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.651790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d44554668-c49q8" event={"ID":"2ea2def9-7751-439c-8c18-05f3568cae9f","Type":"ContainerStarted","Data":"59afb24a43ed6992ee7cae97731e59d81d05379863f0e4108ba93ce3f80a90dd"}
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.652087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d44554668-c49q8" event={"ID":"2ea2def9-7751-439c-8c18-05f3568cae9f","Type":"ContainerStarted","Data":"267c66d4d41f326a37c8799a5dfb63c4e019126ead7991c2d60bf33cb3bdd6bc"}
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.656622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b845bddc9-bzwtz" event={"ID":"16aa5e20-01b7-401e-abfd-161e81af9c70","Type":"ContainerStarted","Data":"71e17b091a27364dc3f7c5f1a38ecfaee242a93853a48e0fc391f7ef6da75de3"}
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.656664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b845bddc9-bzwtz" event={"ID":"16aa5e20-01b7-401e-abfd-161e81af9c70","Type":"ContainerStarted","Data":"3fa0cb6aabb32180eaff4cfedb86a5068fdc5ea3d9b4971b22edbde43359f362"}
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.665662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa8559a3-5137-4d82-a189-18e060db5fa5","Type":"ContainerStarted","Data":"f3bb4c20e04e0f0855b90315c973b51cfd39c5ff1651ea0915d2a5cb8a62690f"}
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.668367 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.680291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d44554668-c49q8" podStartSLOduration=2.603544468 podStartE2EDuration="4.680275736s" podCreationTimestamp="2026-02-19 18:52:48 +0000 UTC" firstStartedPulling="2026-02-19 18:52:49.820487579 +0000 UTC m=+1143.781707533" lastFinishedPulling="2026-02-19 18:52:51.897218847 +0000 UTC m=+1145.858438801" observedRunningTime="2026-02-19 18:52:52.678720322 +0000 UTC m=+1146.639940296" watchObservedRunningTime="2026-02-19 18:52:52.680275736 +0000 UTC m=+1146.641495680"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.691949 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4a9e03-efdd-49ff-b106-8eb68a971e78" path="/var/lib/kubelet/pods/fc4a9e03-efdd-49ff-b106-8eb68a971e78/volumes"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.713772 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.7137482 podStartE2EDuration="4.7137482s" podCreationTimestamp="2026-02-19 18:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:52.698595727 +0000 UTC m=+1146.659815691" watchObservedRunningTime="2026-02-19 18:52:52.7137482 +0000 UTC m=+1146.674968154"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.726654 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b845bddc9-bzwtz" podStartSLOduration=2.661626108 podStartE2EDuration="4.72663217s" podCreationTimestamp="2026-02-19 18:52:48 +0000 UTC" firstStartedPulling="2026-02-19 18:52:49.829304074 +0000 UTC m=+1143.790524028" lastFinishedPulling="2026-02-19 18:52:51.894310136 +0000 UTC m=+1145.855530090" observedRunningTime="2026-02-19 18:52:52.725301933 +0000 UTC m=+1146.686521887" watchObservedRunningTime="2026-02-19 18:52:52.72663217 +0000 UTC m=+1146.687852144"
Feb 19 18:52:52 crc kubenswrapper[4749]: I0219 18:52:52.911836 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d97b8448-pl5l5"]
Feb 19 18:52:53 crc kubenswrapper[4749]: I0219 18:52:53.677201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d97b8448-pl5l5" event={"ID":"f19ae41b-3804-434e-b4a6-d461167f9548","Type":"ContainerStarted","Data":"e8993d76f36dd5719ca3b8d2943ccbbfff8b26733080ea59cec2bdaa6b83d5c9"}
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.203090 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.292827 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.688670 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="cinder-scheduler" containerID="cri-o://81559378aa5cae39127229a527270851fc8cc05fb6eb870d68a51ac2e36acba8" gracePeriod=30
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.690927 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="probe" containerID="cri-o://c217810bc994ed64df8a5fb18536c7b49979ce42ac393e0fc0530f121332c7a1" gracePeriod=30
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.695220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d97b8448-pl5l5" event={"ID":"f19ae41b-3804-434e-b4a6-d461167f9548","Type":"ContainerStarted","Data":"aad60217a910e6c42b6e4db5a9ef6feb6ae5102e30f9ef136d8c280d581a0a40"}
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.695297 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.695314 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d97b8448-pl5l5"
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.695324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d97b8448-pl5l5" event={"ID":"f19ae41b-3804-434e-b4a6-d461167f9548","Type":"ContainerStarted","Data":"aba105812d4bad0e50948e4825b0dbd1e68b8d6134568896381876fcbb5d7c62"}
Feb 19 18:52:54 crc kubenswrapper[4749]: I0219 18:52:54.718839 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d97b8448-pl5l5" podStartSLOduration=2.718816749 podStartE2EDuration="2.718816749s" podCreationTimestamp="2026-02-19 18:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:54.713267215 +0000 UTC m=+1148.674487169" watchObservedRunningTime="2026-02-19 18:52:54.718816749 +0000 UTC m=+1148.680036713"
Feb 19 18:52:55 crc kubenswrapper[4749]: I0219 18:52:55.703805 4749 generic.go:334] "Generic (PLEG): container finished" podID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerID="c217810bc994ed64df8a5fb18536c7b49979ce42ac393e0fc0530f121332c7a1" exitCode=0
Feb 19 18:52:55 crc kubenswrapper[4749]: I0219 18:52:55.704123 4749 generic.go:334] "Generic (PLEG): container finished" podID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerID="81559378aa5cae39127229a527270851fc8cc05fb6eb870d68a51ac2e36acba8" exitCode=0
Feb 19 18:52:55 crc kubenswrapper[4749]: I0219 18:52:55.703880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e78771f8-d9d8-4ca8-9295-dda27179f45c","Type":"ContainerDied","Data":"c217810bc994ed64df8a5fb18536c7b49979ce42ac393e0fc0530f121332c7a1"}
Feb 19 18:52:55 crc kubenswrapper[4749]: I0219 18:52:55.704186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e78771f8-d9d8-4ca8-9295-dda27179f45c","Type":"ContainerDied","Data":"81559378aa5cae39127229a527270851fc8cc05fb6eb870d68a51ac2e36acba8"}
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.050670 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78cffb9ffd-xjhv2"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.052042 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78cffb9ffd-xjhv2"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.055929 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.105685 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-combined-ca-bundle\") pod \"e78771f8-d9d8-4ca8-9295-dda27179f45c\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") "
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.105765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-scripts\") pod \"e78771f8-d9d8-4ca8-9295-dda27179f45c\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") "
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.105824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e78771f8-d9d8-4ca8-9295-dda27179f45c-etc-machine-id\") pod \"e78771f8-d9d8-4ca8-9295-dda27179f45c\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") "
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.105878 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcz94\" (UniqueName: \"kubernetes.io/projected/e78771f8-d9d8-4ca8-9295-dda27179f45c-kube-api-access-qcz94\") pod \"e78771f8-d9d8-4ca8-9295-dda27179f45c\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") "
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.105935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data-custom\") pod \"e78771f8-d9d8-4ca8-9295-dda27179f45c\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") "
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.105968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data\") pod \"e78771f8-d9d8-4ca8-9295-dda27179f45c\" (UID: \"e78771f8-d9d8-4ca8-9295-dda27179f45c\") "
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.110994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e78771f8-d9d8-4ca8-9295-dda27179f45c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e78771f8-d9d8-4ca8-9295-dda27179f45c" (UID: "e78771f8-d9d8-4ca8-9295-dda27179f45c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.135221 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-scripts" (OuterVolumeSpecName: "scripts") pod "e78771f8-d9d8-4ca8-9295-dda27179f45c" (UID: "e78771f8-d9d8-4ca8-9295-dda27179f45c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.139721 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e78771f8-d9d8-4ca8-9295-dda27179f45c" (UID: "e78771f8-d9d8-4ca8-9295-dda27179f45c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.141705 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78771f8-d9d8-4ca8-9295-dda27179f45c-kube-api-access-qcz94" (OuterVolumeSpecName: "kube-api-access-qcz94") pod "e78771f8-d9d8-4ca8-9295-dda27179f45c" (UID: "e78771f8-d9d8-4ca8-9295-dda27179f45c"). InnerVolumeSpecName "kube-api-access-qcz94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.180889 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e78771f8-d9d8-4ca8-9295-dda27179f45c" (UID: "e78771f8-d9d8-4ca8-9295-dda27179f45c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.208275 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.208308 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.208318 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.208327 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e78771f8-d9d8-4ca8-9295-dda27179f45c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.208337 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcz94\" (UniqueName: \"kubernetes.io/projected/e78771f8-d9d8-4ca8-9295-dda27179f45c-kube-api-access-qcz94\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.269169 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data" (OuterVolumeSpecName: "config-data") pod "e78771f8-d9d8-4ca8-9295-dda27179f45c" (UID: "e78771f8-d9d8-4ca8-9295-dda27179f45c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.312902 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78771f8-d9d8-4ca8-9295-dda27179f45c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.423932 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-64c65fc786-5rs9n"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.716146 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.716586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e78771f8-d9d8-4ca8-9295-dda27179f45c","Type":"ContainerDied","Data":"81fdcafe37332ff6aba335e95290b2db8d8b7caf1b119cacee6a45ab0f792976"}
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.716620 4749 scope.go:117] "RemoveContainer" containerID="c217810bc994ed64df8a5fb18536c7b49979ce42ac393e0fc0530f121332c7a1"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.742488 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.750473 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.758300 4749 scope.go:117] "RemoveContainer" containerID="81559378aa5cae39127229a527270851fc8cc05fb6eb870d68a51ac2e36acba8"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.765786 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:56 crc kubenswrapper[4749]: E0219 18:52:56.766254 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="cinder-scheduler"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.766270 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="cinder-scheduler"
Feb 19 18:52:56 crc kubenswrapper[4749]: E0219 18:52:56.766312 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="probe"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.766320 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="probe"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.766579 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="probe"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.766631 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" containerName="cinder-scheduler"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.768137 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.770461 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.786065 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.837618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-config-data\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.837690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-scripts\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.837784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.837815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.837950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snrcq\" (UniqueName: \"kubernetes.io/projected/f19d3222-dbed-44bf-94e0-7a17f5906051-kube-api-access-snrcq\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.838012 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f19d3222-dbed-44bf-94e0-7a17f5906051-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.939283 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snrcq\" (UniqueName: \"kubernetes.io/projected/f19d3222-dbed-44bf-94e0-7a17f5906051-kube-api-access-snrcq\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.939354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f19d3222-dbed-44bf-94e0-7a17f5906051-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.939416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-config-data\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.939457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-scripts\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.939493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.939516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.940314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f19d3222-dbed-44bf-94e0-7a17f5906051-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.945248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.945332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-scripts\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.945807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-config-data\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.952857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19d3222-dbed-44bf-94e0-7a17f5906051-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:56 crc kubenswrapper[4749]: I0219 18:52:56.959958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snrcq\" (UniqueName: \"kubernetes.io/projected/f19d3222-dbed-44bf-94e0-7a17f5906051-kube-api-access-snrcq\") pod \"cinder-scheduler-0\" (UID: \"f19d3222-dbed-44bf-94e0-7a17f5906051\") " pod="openstack/cinder-scheduler-0"
Feb 19 18:52:57 crc kubenswrapper[4749]: I0219 18:52:57.088427 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 18:52:57 crc kubenswrapper[4749]: I0219 18:52:57.581646 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 18:52:57 crc kubenswrapper[4749]: I0219 18:52:57.679225 4749 scope.go:117] "RemoveContainer" containerID="bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd"
Feb 19 18:52:57 crc kubenswrapper[4749]: E0219 18:52:57.679509 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(763173db-176a-426d-bd85-e051d56ec5cf)\"" pod="openstack/watcher-decision-engine-0" podUID="763173db-176a-426d-bd85-e051d56ec5cf"
Feb 19 18:52:57 crc kubenswrapper[4749]: I0219 18:52:57.725010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f19d3222-dbed-44bf-94e0-7a17f5906051","Type":"ContainerStarted","Data":"ad19c14d5c9331d49c65df0df0e9e2daed2f5a1b8700ea2c3ca80e561f586e55"}
Feb 19 18:52:57 crc kubenswrapper[4749]: I0219 18:52:57.830740 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-59976cccdd-n7pz6"
Feb 19 18:52:58 crc kubenswrapper[4749]: I0219 18:52:58.693832 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78771f8-d9d8-4ca8-9295-dda27179f45c" path="/var/lib/kubelet/pods/e78771f8-d9d8-4ca8-9295-dda27179f45c/volumes"
Feb 19 18:52:58 crc kubenswrapper[4749]: I0219 18:52:58.744069 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85c7d94649-hz2gq"
Feb 19 18:52:58 crc kubenswrapper[4749]: I0219 18:52:58.773168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f19d3222-dbed-44bf-94e0-7a17f5906051","Type":"ContainerStarted","Data":"803c62ee7587e8db452ee7e4243db9bf9b3fbfe69fc6ac21ea957ed18c476c69"}
Feb 19 18:52:58 crc kubenswrapper[4749]: I0219 18:52:58.808401 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64c65fc786-5rs9n"]
Feb 19 18:52:58 crc kubenswrapper[4749]: I0219 18:52:58.808785 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64c65fc786-5rs9n" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-httpd" containerID="cri-o://8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842" gracePeriod=30
Feb 19 18:52:58 crc kubenswrapper[4749]: I0219 18:52:58.808720 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64c65fc786-5rs9n" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-api" containerID="cri-o://7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab" gracePeriod=30
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.065363 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-548647668b-bwckt"
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.140082 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78cffb9ffd-xjhv2"]
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.140524 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78cffb9ffd-xjhv2" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-log" containerID="cri-o://44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630" gracePeriod=30
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.141211 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78cffb9ffd-xjhv2" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-api" containerID="cri-o://7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a" gracePeriod=30
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.279236 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd"
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.369093 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dcd8cd889-wv2ds"]
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.369461 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" podUID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerName="dnsmasq-dns" containerID="cri-o://68823225846a106051d510641f982fd731d3cc7b2a596eb3b8a2beecd361de79" gracePeriod=10
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.795960 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerID="44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630" exitCode=143
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.796166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cffb9ffd-xjhv2" event={"ID":"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5","Type":"ContainerDied","Data":"44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630"}
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.813580 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerID="8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842" exitCode=0
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.813660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64c65fc786-5rs9n" event={"ID":"cd0afcb9-06ed-4981-a030-34f26ae748c3","Type":"ContainerDied","Data":"8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842"}
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.824173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f19d3222-dbed-44bf-94e0-7a17f5906051","Type":"ContainerStarted","Data":"92e470f10e69842757e655136feb86bfe7d59723087733937f72fd113986ff5f"}
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.830317 4749 generic.go:334] "Generic (PLEG): container finished" podID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerID="68823225846a106051d510641f982fd731d3cc7b2a596eb3b8a2beecd361de79" exitCode=0
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.830360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" event={"ID":"b1a9182f-9dd3-40f6-a0b1-12b570382705","Type":"ContainerDied","Data":"68823225846a106051d510641f982fd731d3cc7b2a596eb3b8a2beecd361de79"}
Feb 19 18:52:59 crc kubenswrapper[4749]: I0219 18:52:59.855228 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.855210395 podStartE2EDuration="3.855210395s" podCreationTimestamp="2026-02-19 18:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:52:59.849537127 +0000 UTC m=+1153.810757081" watchObservedRunningTime="2026-02-19 18:52:59.855210395 +0000 UTC m=+1153.816430339"
Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.216705 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds"
Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.311633 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-sb\") pod \"b1a9182f-9dd3-40f6-a0b1-12b570382705\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") "
Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.312108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-nb\") pod \"b1a9182f-9dd3-40f6-a0b1-12b570382705\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") "
Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.312163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-config\") pod \"b1a9182f-9dd3-40f6-a0b1-12b570382705\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") "
Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.312207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx45d\" (UniqueName: \"kubernetes.io/projected/b1a9182f-9dd3-40f6-a0b1-12b570382705-kube-api-access-fx45d\") pod \"b1a9182f-9dd3-40f6-a0b1-12b570382705\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") "
Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.312249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-svc\") pod \"b1a9182f-9dd3-40f6-a0b1-12b570382705\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") "
Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.312299 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\"
(UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-swift-storage-0\") pod \"b1a9182f-9dd3-40f6-a0b1-12b570382705\" (UID: \"b1a9182f-9dd3-40f6-a0b1-12b570382705\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.334223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a9182f-9dd3-40f6-a0b1-12b570382705-kube-api-access-fx45d" (OuterVolumeSpecName: "kube-api-access-fx45d") pod "b1a9182f-9dd3-40f6-a0b1-12b570382705" (UID: "b1a9182f-9dd3-40f6-a0b1-12b570382705"). InnerVolumeSpecName "kube-api-access-fx45d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.382769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1a9182f-9dd3-40f6-a0b1-12b570382705" (UID: "b1a9182f-9dd3-40f6-a0b1-12b570382705"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.415439 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.415465 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx45d\" (UniqueName: \"kubernetes.io/projected/b1a9182f-9dd3-40f6-a0b1-12b570382705-kube-api-access-fx45d\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.435644 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1a9182f-9dd3-40f6-a0b1-12b570382705" (UID: "b1a9182f-9dd3-40f6-a0b1-12b570382705"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.467219 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-config" (OuterVolumeSpecName: "config") pod "b1a9182f-9dd3-40f6-a0b1-12b570382705" (UID: "b1a9182f-9dd3-40f6-a0b1-12b570382705"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.500352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b1a9182f-9dd3-40f6-a0b1-12b570382705" (UID: "b1a9182f-9dd3-40f6-a0b1-12b570382705"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.509570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1a9182f-9dd3-40f6-a0b1-12b570382705" (UID: "b1a9182f-9dd3-40f6-a0b1-12b570382705"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.517650 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.517866 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.518192 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.518264 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1a9182f-9dd3-40f6-a0b1-12b570382705-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.672078 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.722733 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-public-tls-certs\") pod \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.722774 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsdjw\" (UniqueName: \"kubernetes.io/projected/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-kube-api-access-qsdjw\") pod \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.722813 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-config-data\") pod \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.722844 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-logs\") pod \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.722863 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-internal-tls-certs\") pod \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.722883 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-scripts\") pod \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.722967 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-combined-ca-bundle\") pod \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\" (UID: \"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5\") " Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.726570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-logs" (OuterVolumeSpecName: "logs") pod "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" (UID: "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.746410 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-kube-api-access-qsdjw" (OuterVolumeSpecName: "kube-api-access-qsdjw") pod "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" (UID: "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5"). InnerVolumeSpecName "kube-api-access-qsdjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.757532 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-scripts" (OuterVolumeSpecName: "scripts") pod "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" (UID: "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.831535 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsdjw\" (UniqueName: \"kubernetes.io/projected/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-kube-api-access-qsdjw\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.831560 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.831569 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.852267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" event={"ID":"b1a9182f-9dd3-40f6-a0b1-12b570382705","Type":"ContainerDied","Data":"2296a32bf91398e5096c030f8233284784a37cc2b27348420bd2ab94aa4ba5aa"} Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.852334 4749 scope.go:117] "RemoveContainer" containerID="68823225846a106051d510641f982fd731d3cc7b2a596eb3b8a2beecd361de79" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.852511 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dcd8cd889-wv2ds" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.862007 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerID="7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a" exitCode=0 Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.862082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cffb9ffd-xjhv2" event={"ID":"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5","Type":"ContainerDied","Data":"7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a"} Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.862167 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cffb9ffd-xjhv2" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.862174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cffb9ffd-xjhv2" event={"ID":"7b35ccc4-45ea-446a-8e12-5e8d57ccbda5","Type":"ContainerDied","Data":"750433d7989330751a636d9ae82300173cddae73dcd48458f86abc04cf9e9dc0"} Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.865146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" (UID: "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.866291 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-config-data" (OuterVolumeSpecName: "config-data") pod "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" (UID: "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.890168 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" (UID: "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.892253 4749 scope.go:117] "RemoveContainer" containerID="864ca710bf18b9c2c27af884e0d6f7d42aba809fb695ab137ad4f67f227c1993" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.902273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" (UID: "7b35ccc4-45ea-446a-8e12-5e8d57ccbda5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.904123 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dcd8cd889-wv2ds"] Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.927081 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dcd8cd889-wv2ds"] Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.932302 4749 scope.go:117] "RemoveContainer" containerID="7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.933435 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.933460 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.933469 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.933477 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.959328 4749 scope.go:117] "RemoveContainer" containerID="44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.983123 4749 scope.go:117] "RemoveContainer" containerID="7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a" Feb 19 
18:53:00 crc kubenswrapper[4749]: E0219 18:53:00.983556 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a\": container with ID starting with 7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a not found: ID does not exist" containerID="7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.983595 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a"} err="failed to get container status \"7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a\": rpc error: code = NotFound desc = could not find container \"7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a\": container with ID starting with 7608bdcb27a2cebde3e0f9c777ade7b95e122464ac03687e36db379b339f2e1a not found: ID does not exist" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.983621 4749 scope.go:117] "RemoveContainer" containerID="44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630" Feb 19 18:53:00 crc kubenswrapper[4749]: E0219 18:53:00.983952 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630\": container with ID starting with 44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630 not found: ID does not exist" containerID="44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630" Feb 19 18:53:00 crc kubenswrapper[4749]: I0219 18:53:00.983968 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630"} err="failed to get container status 
\"44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630\": rpc error: code = NotFound desc = could not find container \"44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630\": container with ID starting with 44a454c44ae72f405d6bfcd1bed6c789721d79e3f3df21d430e1b22966408630 not found: ID does not exist" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.198915 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78cffb9ffd-xjhv2"] Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.206968 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-78cffb9ffd-xjhv2"] Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.501123 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.546744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-ovndb-tls-certs\") pod \"cd0afcb9-06ed-4981-a030-34f26ae748c3\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.546807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-combined-ca-bundle\") pod \"cd0afcb9-06ed-4981-a030-34f26ae748c3\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.546948 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-config\") pod \"cd0afcb9-06ed-4981-a030-34f26ae748c3\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.546984 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8czks\" (UniqueName: \"kubernetes.io/projected/cd0afcb9-06ed-4981-a030-34f26ae748c3-kube-api-access-8czks\") pod \"cd0afcb9-06ed-4981-a030-34f26ae748c3\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.547012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-httpd-config\") pod \"cd0afcb9-06ed-4981-a030-34f26ae748c3\" (UID: \"cd0afcb9-06ed-4981-a030-34f26ae748c3\") " Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.565848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd0afcb9-06ed-4981-a030-34f26ae748c3-kube-api-access-8czks" (OuterVolumeSpecName: "kube-api-access-8czks") pod "cd0afcb9-06ed-4981-a030-34f26ae748c3" (UID: "cd0afcb9-06ed-4981-a030-34f26ae748c3"). InnerVolumeSpecName "kube-api-access-8czks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.573184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cd0afcb9-06ed-4981-a030-34f26ae748c3" (UID: "cd0afcb9-06ed-4981-a030-34f26ae748c3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.617120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-config" (OuterVolumeSpecName: "config") pod "cd0afcb9-06ed-4981-a030-34f26ae748c3" (UID: "cd0afcb9-06ed-4981-a030-34f26ae748c3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.635175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd0afcb9-06ed-4981-a030-34f26ae748c3" (UID: "cd0afcb9-06ed-4981-a030-34f26ae748c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.649929 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.649962 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.649972 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8czks\" (UniqueName: \"kubernetes.io/projected/cd0afcb9-06ed-4981-a030-34f26ae748c3-kube-api-access-8czks\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.649982 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.656179 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cd0afcb9-06ed-4981-a030-34f26ae748c3" (UID: "cd0afcb9-06ed-4981-a030-34f26ae748c3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.751539 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0afcb9-06ed-4981-a030-34f26ae748c3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.874895 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerID="7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab" exitCode=0 Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.874928 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64c65fc786-5rs9n" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.874994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64c65fc786-5rs9n" event={"ID":"cd0afcb9-06ed-4981-a030-34f26ae748c3","Type":"ContainerDied","Data":"7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab"} Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.875048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64c65fc786-5rs9n" event={"ID":"cd0afcb9-06ed-4981-a030-34f26ae748c3","Type":"ContainerDied","Data":"5778186b98cdccbb043af081d747247c9984d10e46b707a090a14f8a3b85013d"} Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.875072 4749 scope.go:117] "RemoveContainer" containerID="8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.925722 4749 scope.go:117] "RemoveContainer" containerID="7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.939144 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64c65fc786-5rs9n"] Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.950450 4749 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-64c65fc786-5rs9n"] Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.964057 4749 scope.go:117] "RemoveContainer" containerID="8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842" Feb 19 18:53:01 crc kubenswrapper[4749]: E0219 18:53:01.965368 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842\": container with ID starting with 8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842 not found: ID does not exist" containerID="8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.965405 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842"} err="failed to get container status \"8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842\": rpc error: code = NotFound desc = could not find container \"8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842\": container with ID starting with 8d09a764d722e755c6a433d6318e930d5aff9b5a5b875bc9822a77e9a76fd842 not found: ID does not exist" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 18:53:01.965431 4749 scope.go:117] "RemoveContainer" containerID="7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab" Feb 19 18:53:01 crc kubenswrapper[4749]: E0219 18:53:01.966720 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab\": container with ID starting with 7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab not found: ID does not exist" containerID="7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab" Feb 19 18:53:01 crc kubenswrapper[4749]: I0219 
18:53:01.966745 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab"} err="failed to get container status \"7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab\": rpc error: code = NotFound desc = could not find container \"7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab\": container with ID starting with 7679c4c78ad17569a830bc78b91549f45e3b99b32012c7029ba9b8e4c4c3aaab not found: ID does not exist" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.089110 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.287152 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.316379 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.345980 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-866f5f4f4b-zfvsn" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.166:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.166:8443: connect: connection refused" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.346092 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-866f5f4f4b-zfvsn" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.558768 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 18:53:02 crc kubenswrapper[4749]: E0219 18:53:02.559517 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-log" 
Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.559605 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-log" Feb 19 18:53:02 crc kubenswrapper[4749]: E0219 18:53:02.559669 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-api" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.559725 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-api" Feb 19 18:53:02 crc kubenswrapper[4749]: E0219 18:53:02.559786 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerName="dnsmasq-dns" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.559845 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerName="dnsmasq-dns" Feb 19 18:53:02 crc kubenswrapper[4749]: E0219 18:53:02.559918 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-httpd" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.559975 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-httpd" Feb 19 18:53:02 crc kubenswrapper[4749]: E0219 18:53:02.560059 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerName="init" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.560127 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerName="init" Feb 19 18:53:02 crc kubenswrapper[4749]: E0219 18:53:02.560198 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-api" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 
18:53:02.560300 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-api" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.560591 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-httpd" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.560702 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a9182f-9dd3-40f6-a0b1-12b570382705" containerName="dnsmasq-dns" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.560769 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-log" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.560907 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" containerName="neutron-api" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.561611 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" containerName="placement-api" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.562324 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.566500 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.566537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.568391 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5zphj" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.601092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.672231 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d676f6e-9d56-41ab-9689-a19a0b9665f7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.672352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d676f6e-9d56-41ab-9689-a19a0b9665f7-openstack-config\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.672391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvbdl\" (UniqueName: \"kubernetes.io/projected/5d676f6e-9d56-41ab-9689-a19a0b9665f7-kube-api-access-jvbdl\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.672486 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d676f6e-9d56-41ab-9689-a19a0b9665f7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.708617 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b35ccc4-45ea-446a-8e12-5e8d57ccbda5" path="/var/lib/kubelet/pods/7b35ccc4-45ea-446a-8e12-5e8d57ccbda5/volumes" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.709449 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a9182f-9dd3-40f6-a0b1-12b570382705" path="/var/lib/kubelet/pods/b1a9182f-9dd3-40f6-a0b1-12b570382705/volumes" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.710228 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd0afcb9-06ed-4981-a030-34f26ae748c3" path="/var/lib/kubelet/pods/cd0afcb9-06ed-4981-a030-34f26ae748c3/volumes" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.775272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d676f6e-9d56-41ab-9689-a19a0b9665f7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.775681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d676f6e-9d56-41ab-9689-a19a0b9665f7-openstack-config\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.775720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvbdl\" (UniqueName: 
\"kubernetes.io/projected/5d676f6e-9d56-41ab-9689-a19a0b9665f7-kube-api-access-jvbdl\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.775843 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d676f6e-9d56-41ab-9689-a19a0b9665f7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.779418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d676f6e-9d56-41ab-9689-a19a0b9665f7-openstack-config\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.781960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d676f6e-9d56-41ab-9689-a19a0b9665f7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.790644 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d676f6e-9d56-41ab-9689-a19a0b9665f7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.815588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvbdl\" (UniqueName: \"kubernetes.io/projected/5d676f6e-9d56-41ab-9689-a19a0b9665f7-kube-api-access-jvbdl\") pod \"openstackclient\" (UID: \"5d676f6e-9d56-41ab-9689-a19a0b9665f7\") " 
pod="openstack/openstackclient" Feb 19 18:53:02 crc kubenswrapper[4749]: I0219 18:53:02.880245 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 18:53:03 crc kubenswrapper[4749]: I0219 18:53:03.020040 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="fa8559a3-5137-4d82-a189-18e060db5fa5" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.184:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 18:53:03 crc kubenswrapper[4749]: I0219 18:53:03.390018 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 18:53:03 crc kubenswrapper[4749]: I0219 18:53:03.420110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 18:53:03 crc kubenswrapper[4749]: I0219 18:53:03.911626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d676f6e-9d56-41ab-9689-a19a0b9665f7","Type":"ContainerStarted","Data":"f24afc7810ccc8dcf98ca97c6e1ffe0e6b323a752e992fd22354d49ee2ab9018"} Feb 19 18:53:04 crc kubenswrapper[4749]: I0219 18:53:04.271627 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:53:04 crc kubenswrapper[4749]: I0219 18:53:04.379983 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d97b8448-pl5l5" Feb 19 18:53:04 crc kubenswrapper[4749]: I0219 18:53:04.449571 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-dc7975d7b-jl97p"] Feb 19 18:53:04 crc kubenswrapper[4749]: I0219 18:53:04.449777 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-dc7975d7b-jl97p" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api-log" 
containerID="cri-o://0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85" gracePeriod=30 Feb 19 18:53:04 crc kubenswrapper[4749]: I0219 18:53:04.451099 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-dc7975d7b-jl97p" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api" containerID="cri-o://92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04" gracePeriod=30 Feb 19 18:53:04 crc kubenswrapper[4749]: I0219 18:53:04.945758 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerID="0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85" exitCode=143 Feb 19 18:53:04 crc kubenswrapper[4749]: I0219 18:53:04.947295 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc7975d7b-jl97p" event={"ID":"ee13f81f-9ab7-496e-b4e6-85a530c1774e","Type":"ContainerDied","Data":"0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85"} Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.712568 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6966bc7795-zbh89"] Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.721414 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.757731 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.758865 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.760325 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.779157 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6966bc7795-zbh89"] Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.865365 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-etc-swift\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.865400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-config-data\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.865417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-combined-ca-bundle\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 
18:53:05.865444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-run-httpd\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.865462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-public-tls-certs\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.865488 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-internal-tls-certs\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.865502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-log-httpd\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.865661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrqr\" (UniqueName: \"kubernetes.io/projected/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-kube-api-access-cjrqr\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 
18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.966891 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-etc-swift\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.966928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-config-data\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.966946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-combined-ca-bundle\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.966969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-run-httpd\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.967003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-public-tls-certs\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.967045 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-internal-tls-certs\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.967065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-log-httpd\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.967114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrqr\" (UniqueName: \"kubernetes.io/projected/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-kube-api-access-cjrqr\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.974623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-run-httpd\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.976642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-combined-ca-bundle\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.977442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-log-httpd\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.978384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-internal-tls-certs\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.980791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-config-data\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.981488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-etc-swift\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.983377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-public-tls-certs\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:05 crc kubenswrapper[4749]: I0219 18:53:05.985443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrqr\" (UniqueName: 
\"kubernetes.io/projected/3ebc6a8f-ca72-408a-8add-2a21e7a4c803-kube-api-access-cjrqr\") pod \"swift-proxy-6966bc7795-zbh89\" (UID: \"3ebc6a8f-ca72-408a-8add-2a21e7a4c803\") " pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.042587 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6966bc7795-zbh89" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.131453 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-dc7975d7b-jl97p" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.188:9311/healthcheck\": read tcp 10.217.0.2:60862->10.217.0.188:9311: read: connection reset by peer" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.132065 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-dc7975d7b-jl97p" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.188:9311/healthcheck\": read tcp 10.217.0.2:60860->10.217.0.188:9311: read: connection reset by peer" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.590068 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-dc7975d7b-jl97p" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.633811 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6966bc7795-zbh89"] Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.682794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee13f81f-9ab7-496e-b4e6-85a530c1774e-logs\") pod \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.683501 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data-custom\") pod \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.683696 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zjfw\" (UniqueName: \"kubernetes.io/projected/ee13f81f-9ab7-496e-b4e6-85a530c1774e-kube-api-access-9zjfw\") pod \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.683829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-combined-ca-bundle\") pod \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.684108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data\") pod \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\" (UID: \"ee13f81f-9ab7-496e-b4e6-85a530c1774e\") " 
Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.683711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee13f81f-9ab7-496e-b4e6-85a530c1774e-logs" (OuterVolumeSpecName: "logs") pod "ee13f81f-9ab7-496e-b4e6-85a530c1774e" (UID: "ee13f81f-9ab7-496e-b4e6-85a530c1774e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.690380 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee13f81f-9ab7-496e-b4e6-85a530c1774e-kube-api-access-9zjfw" (OuterVolumeSpecName: "kube-api-access-9zjfw") pod "ee13f81f-9ab7-496e-b4e6-85a530c1774e" (UID: "ee13f81f-9ab7-496e-b4e6-85a530c1774e"). InnerVolumeSpecName "kube-api-access-9zjfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.690710 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee13f81f-9ab7-496e-b4e6-85a530c1774e" (UID: "ee13f81f-9ab7-496e-b4e6-85a530c1774e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.724366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee13f81f-9ab7-496e-b4e6-85a530c1774e" (UID: "ee13f81f-9ab7-496e-b4e6-85a530c1774e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.752707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data" (OuterVolumeSpecName: "config-data") pod "ee13f81f-9ab7-496e-b4e6-85a530c1774e" (UID: "ee13f81f-9ab7-496e-b4e6-85a530c1774e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.786874 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.786922 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zjfw\" (UniqueName: \"kubernetes.io/projected/ee13f81f-9ab7-496e-b4e6-85a530c1774e-kube-api-access-9zjfw\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.786937 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.786949 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee13f81f-9ab7-496e-b4e6-85a530c1774e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.786959 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee13f81f-9ab7-496e-b4e6-85a530c1774e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.969132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6966bc7795-zbh89" 
event={"ID":"3ebc6a8f-ca72-408a-8add-2a21e7a4c803","Type":"ContainerStarted","Data":"e371b452b6ed4717e9d9d957704e5a98b20e5505f800edeee8c55ee757e91ddc"}
Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.969176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6966bc7795-zbh89" event={"ID":"3ebc6a8f-ca72-408a-8add-2a21e7a4c803","Type":"ContainerStarted","Data":"8e7ffba0711984ac69ab821e90b53fb98f1afbbf851afeb0a7e29e050a970c0e"}
Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.971443 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerID="92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04" exitCode=0
Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.971484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc7975d7b-jl97p" event={"ID":"ee13f81f-9ab7-496e-b4e6-85a530c1774e","Type":"ContainerDied","Data":"92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04"}
Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.971545 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dc7975d7b-jl97p"
Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.971578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc7975d7b-jl97p" event={"ID":"ee13f81f-9ab7-496e-b4e6-85a530c1774e","Type":"ContainerDied","Data":"95ac553ea931a44055411e3d8c81b751e89748515d1db968bd47c76f3f2cd0ee"}
Feb 19 18:53:06 crc kubenswrapper[4749]: I0219 18:53:06.971609 4749 scope.go:117] "RemoveContainer" containerID="92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.036624 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.037241 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-central-agent" containerID="cri-o://f19d5a7c0676700b8bb0a7f9196177a390c33f2bf5755b61c48ef371da44afe0" gracePeriod=30
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.037362 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="proxy-httpd" containerID="cri-o://a16f4e4d1acac26c195aeb2c3e2a960c0c0cedff3af620794bb6b1f4acec01c6" gracePeriod=30
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.037400 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="sg-core" containerID="cri-o://bbc77b191ed97fab0dcdefa3a50b1f37a01e3e78e7bfdf81eca25489fd4165d4" gracePeriod=30
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.037431 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-notification-agent" containerID="cri-o://f8643b66de6291bf89ae58c0eba3442fc66922713ac6fb99b8e06f8e3cea4fcb" gracePeriod=30
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.048720 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": EOF"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.124396 4749 scope.go:117] "RemoveContainer" containerID="0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.146687 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-dc7975d7b-jl97p"]
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.150326 4749 scope.go:117] "RemoveContainer" containerID="92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04"
Feb 19 18:53:07 crc kubenswrapper[4749]: E0219 18:53:07.155174 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04\": container with ID starting with 92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04 not found: ID does not exist" containerID="92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.155213 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04"} err="failed to get container status \"92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04\": rpc error: code = NotFound desc = could not find container \"92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04\": container with ID starting with 92679fa58c5c8a59bff0e74b3cd821d421856825aae656fe59c734eb412e5e04 not found: ID does not exist"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.155239 4749 scope.go:117] "RemoveContainer" containerID="0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85"
Feb 19 18:53:07 crc kubenswrapper[4749]: E0219 18:53:07.155512 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85\": container with ID starting with 0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85 not found: ID does not exist" containerID="0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.155565 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85"} err="failed to get container status \"0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85\": rpc error: code = NotFound desc = could not find container \"0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85\": container with ID starting with 0a2a9e2dee36c64764d83eac26426bf3c2c16cae705484223aa2ae8560ebfb85 not found: ID does not exist"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.157830 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-dc7975d7b-jl97p"]
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.296334 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.996998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6966bc7795-zbh89" event={"ID":"3ebc6a8f-ca72-408a-8add-2a21e7a4c803","Type":"ContainerStarted","Data":"e1aa0cd5f8a42a70aa0fa632b1249944e5d00f9ba03de56a2ee512ae85ae9c77"}
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.997235 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6966bc7795-zbh89"
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.999516 4749 generic.go:334] "Generic (PLEG): container finished" podID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerID="7ed458c93c10e8a5d70696643628559a1656a862171ba6383208264fc426b8b5" exitCode=137
Feb 19 18:53:07 crc kubenswrapper[4749]: I0219 18:53:07.999584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866f5f4f4b-zfvsn" event={"ID":"f8dcae23-3df3-4de3-8a0a-499c15a90daa","Type":"ContainerDied","Data":"7ed458c93c10e8a5d70696643628559a1656a862171ba6383208264fc426b8b5"}
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.002681 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7838589-f49f-4b20-a019-48db9b2ae719" containerID="a16f4e4d1acac26c195aeb2c3e2a960c0c0cedff3af620794bb6b1f4acec01c6" exitCode=0
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.002708 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7838589-f49f-4b20-a019-48db9b2ae719" containerID="bbc77b191ed97fab0dcdefa3a50b1f37a01e3e78e7bfdf81eca25489fd4165d4" exitCode=2
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.002717 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7838589-f49f-4b20-a019-48db9b2ae719" containerID="f19d5a7c0676700b8bb0a7f9196177a390c33f2bf5755b61c48ef371da44afe0" exitCode=0
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.002756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerDied","Data":"a16f4e4d1acac26c195aeb2c3e2a960c0c0cedff3af620794bb6b1f4acec01c6"}
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.002775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerDied","Data":"bbc77b191ed97fab0dcdefa3a50b1f37a01e3e78e7bfdf81eca25489fd4165d4"}
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.002788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerDied","Data":"f19d5a7c0676700b8bb0a7f9196177a390c33f2bf5755b61c48ef371da44afe0"}
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.038664 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6966bc7795-zbh89" podStartSLOduration=3.038646141 podStartE2EDuration="3.038646141s" podCreationTimestamp="2026-02-19 18:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:08.020641845 +0000 UTC m=+1161.981861829" watchObservedRunningTime="2026-02-19 18:53:08.038646141 +0000 UTC m=+1161.999866095"
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.483540 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.630445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-tls-certs\") pod \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") "
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.631242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-config-data\") pod \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") "
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.631307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-secret-key\") pod \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") "
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.631408 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-combined-ca-bundle\") pod \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") "
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.631505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-scripts\") pod \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") "
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.631537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x24s\" (UniqueName: \"kubernetes.io/projected/f8dcae23-3df3-4de3-8a0a-499c15a90daa-kube-api-access-9x24s\") pod \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") "
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.631560 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dcae23-3df3-4de3-8a0a-499c15a90daa-logs\") pod \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\" (UID: \"f8dcae23-3df3-4de3-8a0a-499c15a90daa\") "
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.632229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dcae23-3df3-4de3-8a0a-499c15a90daa-logs" (OuterVolumeSpecName: "logs") pod "f8dcae23-3df3-4de3-8a0a-499c15a90daa" (UID: "f8dcae23-3df3-4de3-8a0a-499c15a90daa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.640199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f8dcae23-3df3-4de3-8a0a-499c15a90daa" (UID: "f8dcae23-3df3-4de3-8a0a-499c15a90daa"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.640255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dcae23-3df3-4de3-8a0a-499c15a90daa-kube-api-access-9x24s" (OuterVolumeSpecName: "kube-api-access-9x24s") pod "f8dcae23-3df3-4de3-8a0a-499c15a90daa" (UID: "f8dcae23-3df3-4de3-8a0a-499c15a90daa"). InnerVolumeSpecName "kube-api-access-9x24s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.656533 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-config-data" (OuterVolumeSpecName: "config-data") pod "f8dcae23-3df3-4de3-8a0a-499c15a90daa" (UID: "f8dcae23-3df3-4de3-8a0a-499c15a90daa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.663983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8dcae23-3df3-4de3-8a0a-499c15a90daa" (UID: "f8dcae23-3df3-4de3-8a0a-499c15a90daa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.669958 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-scripts" (OuterVolumeSpecName: "scripts") pod "f8dcae23-3df3-4de3-8a0a-499c15a90daa" (UID: "f8dcae23-3df3-4de3-8a0a-499c15a90daa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.693247 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f8dcae23-3df3-4de3-8a0a-499c15a90daa" (UID: "f8dcae23-3df3-4de3-8a0a-499c15a90daa"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.694778 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" path="/var/lib/kubelet/pods/ee13f81f-9ab7-496e-b4e6-85a530c1774e/volumes"
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.733439 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.733476 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.733488 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.733499 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8dcae23-3df3-4de3-8a0a-499c15a90daa-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.733515 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x24s\" (UniqueName: \"kubernetes.io/projected/f8dcae23-3df3-4de3-8a0a-499c15a90daa-kube-api-access-9x24s\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.733528 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8dcae23-3df3-4de3-8a0a-499c15a90daa-logs\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:08 crc kubenswrapper[4749]: I0219 18:53:08.733539 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8dcae23-3df3-4de3-8a0a-499c15a90daa-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:09 crc kubenswrapper[4749]: I0219 18:53:09.021720 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-866f5f4f4b-zfvsn"
Feb 19 18:53:09 crc kubenswrapper[4749]: I0219 18:53:09.021940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866f5f4f4b-zfvsn" event={"ID":"f8dcae23-3df3-4de3-8a0a-499c15a90daa","Type":"ContainerDied","Data":"b138cbe178c3181869550453becd39b69d38d5ee3e960302cedeabe7f2c27acc"}
Feb 19 18:53:09 crc kubenswrapper[4749]: I0219 18:53:09.022171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6966bc7795-zbh89"
Feb 19 18:53:09 crc kubenswrapper[4749]: I0219 18:53:09.022226 4749 scope.go:117] "RemoveContainer" containerID="2b8e7df8773bc6af522d0c6c7d572642c3a5fdcfab6d7b7b20a03d6c230947c7"
Feb 19 18:53:09 crc kubenswrapper[4749]: I0219 18:53:09.049711 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-866f5f4f4b-zfvsn"]
Feb 19 18:53:09 crc kubenswrapper[4749]: I0219 18:53:09.058172 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-866f5f4f4b-zfvsn"]
Feb 19 18:53:09 crc kubenswrapper[4749]: I0219 18:53:09.209725 4749 scope.go:117] "RemoveContainer" containerID="7ed458c93c10e8a5d70696643628559a1656a862171ba6383208264fc426b8b5"
Feb 19 18:53:10 crc kubenswrapper[4749]: I0219 18:53:10.679492 4749 scope.go:117] "RemoveContainer" containerID="bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd"
Feb 19 18:53:10 crc kubenswrapper[4749]: I0219 18:53:10.689438 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" path="/var/lib/kubelet/pods/f8dcae23-3df3-4de3-8a0a-499c15a90daa/volumes"
Feb 19 18:53:11 crc kubenswrapper[4749]: I0219 18:53:11.196123 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6966bc7795-zbh89"
Feb 19 18:53:13 crc kubenswrapper[4749]: I0219 18:53:13.089972 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7838589-f49f-4b20-a019-48db9b2ae719" containerID="f8643b66de6291bf89ae58c0eba3442fc66922713ac6fb99b8e06f8e3cea4fcb" exitCode=0
Feb 19 18:53:13 crc kubenswrapper[4749]: I0219 18:53:13.090056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerDied","Data":"f8643b66de6291bf89ae58c0eba3442fc66922713ac6fb99b8e06f8e3cea4fcb"}
Feb 19 18:53:13 crc kubenswrapper[4749]: I0219 18:53:13.621039 4749 scope.go:117] "RemoveContainer" containerID="5920f7e8e48bd9db54656e5fa85d999dbef914999968cb69de532e4da7aa1a46"
Feb 19 18:53:13 crc kubenswrapper[4749]: I0219 18:53:13.768677 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": dial tcp 10.217.0.180:3000: connect: connection refused"
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.032171 4749 scope.go:117] "RemoveContainer" containerID="e9ddbea5c3fbfdec50ae62f4eb7b129a5fd5d32a83bbb1a7d8b2aafaca18884d"
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.077987 4749 scope.go:117] "RemoveContainer" containerID="53b6281b165e657ca42cc38c9829498fa3bc9206eed56d7882272a861c64a378"
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.338646 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.463204 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-sg-core-conf-yaml\") pod \"b7838589-f49f-4b20-a019-48db9b2ae719\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") "
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.463237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-combined-ca-bundle\") pod \"b7838589-f49f-4b20-a019-48db9b2ae719\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") "
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.463337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-scripts\") pod \"b7838589-f49f-4b20-a019-48db9b2ae719\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") "
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.463367 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-run-httpd\") pod \"b7838589-f49f-4b20-a019-48db9b2ae719\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") "
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.463432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgkms\" (UniqueName: \"kubernetes.io/projected/b7838589-f49f-4b20-a019-48db9b2ae719-kube-api-access-cgkms\") pod \"b7838589-f49f-4b20-a019-48db9b2ae719\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") "
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.463465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-config-data\") pod \"b7838589-f49f-4b20-a019-48db9b2ae719\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") "
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.463563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-log-httpd\") pod \"b7838589-f49f-4b20-a019-48db9b2ae719\" (UID: \"b7838589-f49f-4b20-a019-48db9b2ae719\") "
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.464774 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7838589-f49f-4b20-a019-48db9b2ae719" (UID: "b7838589-f49f-4b20-a019-48db9b2ae719"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.467094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b7838589-f49f-4b20-a019-48db9b2ae719" (UID: "b7838589-f49f-4b20-a019-48db9b2ae719"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.472299 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-scripts" (OuterVolumeSpecName: "scripts") pod "b7838589-f49f-4b20-a019-48db9b2ae719" (UID: "b7838589-f49f-4b20-a019-48db9b2ae719"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.473306 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7838589-f49f-4b20-a019-48db9b2ae719-kube-api-access-cgkms" (OuterVolumeSpecName: "kube-api-access-cgkms") pod "b7838589-f49f-4b20-a019-48db9b2ae719" (UID: "b7838589-f49f-4b20-a019-48db9b2ae719"). InnerVolumeSpecName "kube-api-access-cgkms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.495119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b7838589-f49f-4b20-a019-48db9b2ae719" (UID: "b7838589-f49f-4b20-a019-48db9b2ae719"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.541099 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7838589-f49f-4b20-a019-48db9b2ae719" (UID: "b7838589-f49f-4b20-a019-48db9b2ae719"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.566262 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgkms\" (UniqueName: \"kubernetes.io/projected/b7838589-f49f-4b20-a019-48db9b2ae719-kube-api-access-cgkms\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.566299 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.566315 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.566327 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.566339 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.566351 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7838589-f49f-4b20-a019-48db9b2ae719-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.569015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-config-data" (OuterVolumeSpecName: "config-data") pod "b7838589-f49f-4b20-a019-48db9b2ae719" (UID: "b7838589-f49f-4b20-a019-48db9b2ae719"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:15 crc kubenswrapper[4749]: I0219 18:53:15.668362 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7838589-f49f-4b20-a019-48db9b2ae719-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.049052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6966bc7795-zbh89"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.134392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerStarted","Data":"325be88592db27be126170ad29b2fe3647f1a72fbdf514c1993695f5fc3d8121"}
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.141622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7838589-f49f-4b20-a019-48db9b2ae719","Type":"ContainerDied","Data":"42f7884fb0c8432df808b91db0331a986e3c47ed84d9ff236ea4e2105527090f"}
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.141694 4749 scope.go:117] "RemoveContainer" containerID="a16f4e4d1acac26c195aeb2c3e2a960c0c0cedff3af620794bb6b1f4acec01c6"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.141898 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.144362 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d676f6e-9d56-41ab-9689-a19a0b9665f7","Type":"ContainerStarted","Data":"3c06d2f03beb27f26a1d46571a8578fc6f037ef56b2cef381015a972042af9a1"}
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.168428 4749 scope.go:117] "RemoveContainer" containerID="bbc77b191ed97fab0dcdefa3a50b1f37a01e3e78e7bfdf81eca25489fd4165d4"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.185667 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.520879367 podStartE2EDuration="14.185650218s" podCreationTimestamp="2026-02-19 18:53:02 +0000 UTC" firstStartedPulling="2026-02-19 18:53:03.456820326 +0000 UTC m=+1157.418040280" lastFinishedPulling="2026-02-19 18:53:15.121591177 +0000 UTC m=+1169.082811131" observedRunningTime="2026-02-19 18:53:16.181647311 +0000 UTC m=+1170.142867275" watchObservedRunningTime="2026-02-19 18:53:16.185650218 +0000 UTC m=+1170.146870172"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.204469 4749 scope.go:117] "RemoveContainer" containerID="f8643b66de6291bf89ae58c0eba3442fc66922713ac6fb99b8e06f8e3cea4fcb"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.215713 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.247060 4749 scope.go:117] "RemoveContainer" containerID="f19d5a7c0676700b8bb0a7f9196177a390c33f2bf5755b61c48ef371da44afe0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.247204 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259254 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259775 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259796 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api"
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259827 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-central-agent"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259837 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-central-agent"
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259853 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon-log"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259861 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon-log"
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259879 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="proxy-httpd"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259888 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="proxy-httpd"
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259901 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-notification-agent"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259910 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-notification-agent"
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259928 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api-log"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259936 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api-log"
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259949 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="sg-core"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259956 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="sg-core"
Feb 19 18:53:16 crc kubenswrapper[4749]: E0219 18:53:16.259974 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.259981 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260206 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="sg-core"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260219 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="proxy-httpd"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260232 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-central-agent"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260247 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" containerName="ceilometer-notification-agent"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260255 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260270 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api-log"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260286 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dcae23-3df3-4de3-8a0a-499c15a90daa" containerName="horizon-log"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.260306 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee13f81f-9ab7-496e-b4e6-85a530c1774e" containerName="barbican-api"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.262515 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.264733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.264892 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.266489 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.382905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-scripts\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.382973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-log-httpd\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.383159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-config-data\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.383178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.383233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btgtf\" (UniqueName: \"kubernetes.io/projected/69cd82cc-4f4d-498d-b976-422d89f21233-kube-api-access-btgtf\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.383305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0"
Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.383331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-run-httpd\") pod \"ceilometer-0\" (UID:
\"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-run-httpd\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-scripts\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485303 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-log-httpd\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-config-data\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btgtf\" (UniqueName: \"kubernetes.io/projected/69cd82cc-4f4d-498d-b976-422d89f21233-kube-api-access-btgtf\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.485778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-run-httpd\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.486014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-log-httpd\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.490668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-config-data\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.491467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-scripts\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.491553 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.491689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.500662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btgtf\" (UniqueName: \"kubernetes.io/projected/69cd82cc-4f4d-498d-b976-422d89f21233-kube-api-access-btgtf\") pod \"ceilometer-0\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.593722 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:53:16 crc kubenswrapper[4749]: I0219 18:53:16.720429 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7838589-f49f-4b20-a019-48db9b2ae719" path="/var/lib/kubelet/pods/b7838589-f49f-4b20-a019-48db9b2ae719/volumes" Feb 19 18:53:17 crc kubenswrapper[4749]: I0219 18:53:17.057221 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:17 crc kubenswrapper[4749]: W0219 18:53:17.066106 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69cd82cc_4f4d_498d_b976_422d89f21233.slice/crio-4b33efe0da505f165fca4220b9e5ef4382250036996ad7662e4d1d2d8f334eed WatchSource:0}: Error finding container 4b33efe0da505f165fca4220b9e5ef4382250036996ad7662e4d1d2d8f334eed: Status 404 returned error can't find the container with id 4b33efe0da505f165fca4220b9e5ef4382250036996ad7662e4d1d2d8f334eed Feb 19 18:53:17 crc kubenswrapper[4749]: I0219 18:53:17.155322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerStarted","Data":"4b33efe0da505f165fca4220b9e5ef4382250036996ad7662e4d1d2d8f334eed"} Feb 19 18:53:18 crc kubenswrapper[4749]: I0219 18:53:18.171826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerStarted","Data":"e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d"} Feb 19 18:53:18 crc kubenswrapper[4749]: I0219 18:53:18.200619 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:19 crc kubenswrapper[4749]: I0219 18:53:19.182649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerStarted","Data":"bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236"} Feb 19 18:53:19 crc kubenswrapper[4749]: I0219 18:53:19.182963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerStarted","Data":"6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce"} Feb 19 18:53:22 crc kubenswrapper[4749]: I0219 18:53:22.208174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerStarted","Data":"302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6"} Feb 19 18:53:22 crc kubenswrapper[4749]: I0219 18:53:22.208582 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-central-agent" containerID="cri-o://e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d" gracePeriod=30 Feb 19 18:53:22 crc kubenswrapper[4749]: I0219 18:53:22.208671 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 18:53:22 crc kubenswrapper[4749]: I0219 18:53:22.209065 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="proxy-httpd" containerID="cri-o://302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6" gracePeriod=30 Feb 19 18:53:22 crc kubenswrapper[4749]: I0219 18:53:22.209120 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="sg-core" containerID="cri-o://bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236" gracePeriod=30 Feb 19 18:53:22 crc kubenswrapper[4749]: I0219 
18:53:22.209158 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-notification-agent" containerID="cri-o://6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce" gracePeriod=30 Feb 19 18:53:22 crc kubenswrapper[4749]: I0219 18:53:22.247658 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.068914661 podStartE2EDuration="6.247642888s" podCreationTimestamp="2026-02-19 18:53:16 +0000 UTC" firstStartedPulling="2026-02-19 18:53:17.068916681 +0000 UTC m=+1171.030136645" lastFinishedPulling="2026-02-19 18:53:21.247644928 +0000 UTC m=+1175.208864872" observedRunningTime="2026-02-19 18:53:22.24194263 +0000 UTC m=+1176.203162584" watchObservedRunningTime="2026-02-19 18:53:22.247642888 +0000 UTC m=+1176.208862842" Feb 19 18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.222531 4749 generic.go:334] "Generic (PLEG): container finished" podID="69cd82cc-4f4d-498d-b976-422d89f21233" containerID="302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6" exitCode=0 Feb 19 18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.223353 4749 generic.go:334] "Generic (PLEG): container finished" podID="69cd82cc-4f4d-498d-b976-422d89f21233" containerID="bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236" exitCode=2 Feb 19 18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.223372 4749 generic.go:334] "Generic (PLEG): container finished" podID="69cd82cc-4f4d-498d-b976-422d89f21233" containerID="6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce" exitCode=0 Feb 19 18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.222616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerDied","Data":"302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6"} Feb 19 
18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.223421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerDied","Data":"bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236"} Feb 19 18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.223442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerDied","Data":"6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce"} Feb 19 18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.552692 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:23 crc kubenswrapper[4749]: I0219 18:53:23.599746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:24 crc kubenswrapper[4749]: I0219 18:53:24.231909 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:24 crc kubenswrapper[4749]: I0219 18:53:24.259878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:24 crc kubenswrapper[4749]: I0219 18:53:24.298743 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:24 crc kubenswrapper[4749]: I0219 18:53:24.725244 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:53:24 crc kubenswrapper[4749]: I0219 18:53:24.725306 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" 
podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:53:25 crc kubenswrapper[4749]: E0219 18:53:25.364234 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69cd82cc_4f4d_498d_b976_422d89f21233.slice/crio-conmon-e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69cd82cc_4f4d_498d_b976_422d89f21233.slice/crio-e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d.scope\": RecentStats: unable to find data in memory cache]" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.677043 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.776427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-config-data\") pod \"69cd82cc-4f4d-498d-b976-422d89f21233\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.776533 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-sg-core-conf-yaml\") pod \"69cd82cc-4f4d-498d-b976-422d89f21233\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.776606 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-log-httpd\") pod 
\"69cd82cc-4f4d-498d-b976-422d89f21233\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.776639 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btgtf\" (UniqueName: \"kubernetes.io/projected/69cd82cc-4f4d-498d-b976-422d89f21233-kube-api-access-btgtf\") pod \"69cd82cc-4f4d-498d-b976-422d89f21233\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.776669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-run-httpd\") pod \"69cd82cc-4f4d-498d-b976-422d89f21233\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.776716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-scripts\") pod \"69cd82cc-4f4d-498d-b976-422d89f21233\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.776800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-combined-ca-bundle\") pod \"69cd82cc-4f4d-498d-b976-422d89f21233\" (UID: \"69cd82cc-4f4d-498d-b976-422d89f21233\") " Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.778398 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69cd82cc-4f4d-498d-b976-422d89f21233" (UID: "69cd82cc-4f4d-498d-b976-422d89f21233"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.778631 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69cd82cc-4f4d-498d-b976-422d89f21233" (UID: "69cd82cc-4f4d-498d-b976-422d89f21233"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.783520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-scripts" (OuterVolumeSpecName: "scripts") pod "69cd82cc-4f4d-498d-b976-422d89f21233" (UID: "69cd82cc-4f4d-498d-b976-422d89f21233"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.784148 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69cd82cc-4f4d-498d-b976-422d89f21233-kube-api-access-btgtf" (OuterVolumeSpecName: "kube-api-access-btgtf") pod "69cd82cc-4f4d-498d-b976-422d89f21233" (UID: "69cd82cc-4f4d-498d-b976-422d89f21233"). InnerVolumeSpecName "kube-api-access-btgtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.816100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69cd82cc-4f4d-498d-b976-422d89f21233" (UID: "69cd82cc-4f4d-498d-b976-422d89f21233"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.874719 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69cd82cc-4f4d-498d-b976-422d89f21233" (UID: "69cd82cc-4f4d-498d-b976-422d89f21233"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.878960 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.878998 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btgtf\" (UniqueName: \"kubernetes.io/projected/69cd82cc-4f4d-498d-b976-422d89f21233-kube-api-access-btgtf\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.879013 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69cd82cc-4f4d-498d-b976-422d89f21233-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.879042 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.879054 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.879065 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.891704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-config-data" (OuterVolumeSpecName: "config-data") pod "69cd82cc-4f4d-498d-b976-422d89f21233" (UID: "69cd82cc-4f4d-498d-b976-422d89f21233"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:25 crc kubenswrapper[4749]: I0219 18:53:25.980710 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cd82cc-4f4d-498d-b976-422d89f21233-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.120899 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.121516 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-httpd" containerID="cri-o://c7bfe50a5bece5a853c839e17c62830ca96a1e0b6c78f9ccb3cd22ae7b470397" gracePeriod=30 Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.121276 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-log" containerID="cri-o://2faff4371d10d3fe6bb565ff39f306b9c697e0b74686119bf87e91ca4a42c13d" gracePeriod=30 Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.289867 4749 generic.go:334] "Generic (PLEG): container finished" podID="69cd82cc-4f4d-498d-b976-422d89f21233" containerID="e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d" exitCode=0 Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 
18:53:26.289944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerDied","Data":"e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d"} Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.289951 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.289972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69cd82cc-4f4d-498d-b976-422d89f21233","Type":"ContainerDied","Data":"4b33efe0da505f165fca4220b9e5ef4382250036996ad7662e4d1d2d8f334eed"} Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.289996 4749 scope.go:117] "RemoveContainer" containerID="302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.296412 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerID="2faff4371d10d3fe6bb565ff39f306b9c697e0b74686119bf87e91ca4a42c13d" exitCode=143 Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.296664 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" containerID="cri-o://325be88592db27be126170ad29b2fe3647f1a72fbdf514c1993695f5fc3d8121" gracePeriod=30 Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.297003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e26f3eb-2933-4e84-9be8-2529ff29fc93","Type":"ContainerDied","Data":"2faff4371d10d3fe6bb565ff39f306b9c697e0b74686119bf87e91ca4a42c13d"} Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.337464 4749 scope.go:117] "RemoveContainer" 
containerID="bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.407819 4749 scope.go:117] "RemoveContainer" containerID="6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.426277 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.439044 4749 scope.go:117] "RemoveContainer" containerID="e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.444833 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.457187 4749 scope.go:117] "RemoveContainer" containerID="302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6" Feb 19 18:53:26 crc kubenswrapper[4749]: E0219 18:53:26.457756 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6\": container with ID starting with 302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6 not found: ID does not exist" containerID="302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.457861 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6"} err="failed to get container status \"302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6\": rpc error: code = NotFound desc = could not find container \"302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6\": container with ID starting with 302e0b550a9c92c25e9dd2d2182c9e1c1e96746a4f8db8dad27913a510d877d6 not found: ID does not exist" Feb 19 18:53:26 crc 
kubenswrapper[4749]: I0219 18:53:26.457937 4749 scope.go:117] "RemoveContainer" containerID="bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.459853 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:26 crc kubenswrapper[4749]: E0219 18:53:26.460288 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="sg-core" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460310 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="sg-core" Feb 19 18:53:26 crc kubenswrapper[4749]: E0219 18:53:26.460325 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="proxy-httpd" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460332 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="proxy-httpd" Feb 19 18:53:26 crc kubenswrapper[4749]: E0219 18:53:26.460350 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-notification-agent" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460357 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-notification-agent" Feb 19 18:53:26 crc kubenswrapper[4749]: E0219 18:53:26.460382 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-central-agent" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460388 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-central-agent" Feb 19 18:53:26 crc kubenswrapper[4749]: E0219 18:53:26.460528 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236\": container with ID starting with bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236 not found: ID does not exist" containerID="bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460564 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="proxy-httpd" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460576 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-central-agent" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460572 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236"} err="failed to get container status \"bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236\": rpc error: code = NotFound desc = could not find container \"bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236\": container with ID starting with bbfa1096c03c17e0ac13e73401789b814ed520c6224aad8d33c71703c6e55236 not found: ID does not exist" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460596 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="sg-core" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460605 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" containerName="ceilometer-notification-agent" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.460609 4749 scope.go:117] "RemoveContainer" containerID="6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce" Feb 19 18:53:26 
crc kubenswrapper[4749]: E0219 18:53:26.461313 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce\": container with ID starting with 6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce not found: ID does not exist" containerID="6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.461345 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce"} err="failed to get container status \"6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce\": rpc error: code = NotFound desc = could not find container \"6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce\": container with ID starting with 6f370090fb5774fa65c3ab773e26f3c04202f23264dc3114896d648ecabdd9ce not found: ID does not exist" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.461365 4749 scope.go:117] "RemoveContainer" containerID="e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d" Feb 19 18:53:26 crc kubenswrapper[4749]: E0219 18:53:26.461573 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d\": container with ID starting with e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d not found: ID does not exist" containerID="e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.461603 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d"} err="failed to get container status 
\"e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d\": rpc error: code = NotFound desc = could not find container \"e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d\": container with ID starting with e6c82e3a0271d376778ee32109898cf9164458f6158fbe1fc94295b0d88a976d not found: ID does not exist" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.462231 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.466860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.478586 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.478838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.593595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sbnx\" (UniqueName: \"kubernetes.io/projected/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-kube-api-access-2sbnx\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.593638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-config-data\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.593680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-log-httpd\") pod 
\"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.593719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-run-httpd\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.593789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.593846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-scripts\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.593881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.690144 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69cd82cc-4f4d-498d-b976-422d89f21233" path="/var/lib/kubelet/pods/69cd82cc-4f4d-498d-b976-422d89f21233/volumes" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-run-httpd\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-scripts\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sbnx\" (UniqueName: \"kubernetes.io/projected/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-kube-api-access-2sbnx\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695454 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-config-data\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc 
kubenswrapper[4749]: I0219 18:53:26.695481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-log-httpd\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-run-httpd\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.695993 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-log-httpd\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.700790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-scripts\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.701069 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.704120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-config-data\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") 
" pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.715200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.715948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sbnx\" (UniqueName: \"kubernetes.io/projected/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-kube-api-access-2sbnx\") pod \"ceilometer-0\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " pod="openstack/ceilometer-0" Feb 19 18:53:26 crc kubenswrapper[4749]: I0219 18:53:26.793091 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.273114 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:27 crc kubenswrapper[4749]: W0219 18:53:27.301101 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9a36573_a6ea_481c_bdc1_ea3d7b06e2ce.slice/crio-8f06a677c2da21482bd507aa6a6b708e1fe6a12c48971d6abc28454a64438a05 WatchSource:0}: Error finding container 8f06a677c2da21482bd507aa6a6b708e1fe6a12c48971d6abc28454a64438a05: Status 404 returned error can't find the container with id 8f06a677c2da21482bd507aa6a6b708e1fe6a12c48971d6abc28454a64438a05 Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.314054 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerID="c7bfe50a5bece5a853c839e17c62830ca96a1e0b6c78f9ccb3cd22ae7b470397" exitCode=0 Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.314130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"7e26f3eb-2933-4e84-9be8-2529ff29fc93","Type":"ContainerDied","Data":"c7bfe50a5bece5a853c839e17c62830ca96a1e0b6c78f9ccb3cd22ae7b470397"} Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.514008 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-scripts\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-logs\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615558 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxdtj\" (UniqueName: \"kubernetes.io/projected/7e26f3eb-2933-4e84-9be8-2529ff29fc93-kube-api-access-hxdtj\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615610 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-config-data\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615631 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-httpd-run\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-public-tls-certs\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.615843 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-combined-ca-bundle\") pod \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\" (UID: \"7e26f3eb-2933-4e84-9be8-2529ff29fc93\") " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.616240 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-logs" (OuterVolumeSpecName: "logs") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.623280 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-scripts" (OuterVolumeSpecName: "scripts") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.624506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e26f3eb-2933-4e84-9be8-2529ff29fc93-kube-api-access-hxdtj" (OuterVolumeSpecName: "kube-api-access-hxdtj") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "kube-api-access-hxdtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.626506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.630184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.659585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.688225 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-config-data" (OuterVolumeSpecName: "config-data") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.699385 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7e26f3eb-2933-4e84-9be8-2529ff29fc93" (UID: "7e26f3eb-2933-4e84-9be8-2529ff29fc93"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.718053 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.718084 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.718092 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.718101 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxdtj\" (UniqueName: \"kubernetes.io/projected/7e26f3eb-2933-4e84-9be8-2529ff29fc93-kube-api-access-hxdtj\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc 
kubenswrapper[4749]: I0219 18:53:27.718110 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.718118 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e26f3eb-2933-4e84-9be8-2529ff29fc93-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.718127 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e26f3eb-2933-4e84-9be8-2529ff29fc93-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.718148 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.748459 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.820361 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.920813 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.921726 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-log" 
containerID="cri-o://430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f" gracePeriod=30 Feb 19 18:53:27 crc kubenswrapper[4749]: I0219 18:53:27.922140 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-httpd" containerID="cri-o://168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341" gracePeriod=30 Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.132761 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.133064 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="2124f654-902e-4591-9c6f-e98e919dc8ca" containerName="watcher-applier" containerID="cri-o://71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" gracePeriod=30 Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.193509 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.193765 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api-log" containerID="cri-o://0b2d83fedc147ae1ae77fd48b66a049fc9f67e5140ec6175f6d725c0bd11c5c4" gracePeriod=30 Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.193852 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api" containerID="cri-o://407cee09108dc1abb8c7926b286e7267ac6421007336a636481ae314b11fc6cb" gracePeriod=30 Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.329829 4749 generic.go:334] "Generic (PLEG): container finished" podID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" 
containerID="430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f" exitCode=143 Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.329884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e","Type":"ContainerDied","Data":"430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f"} Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.331795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e26f3eb-2933-4e84-9be8-2529ff29fc93","Type":"ContainerDied","Data":"1343d0cfac32a0bf41ee917814cf8be72fb255949b8057ad931ada62110fe2b6"} Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.331827 4749 scope.go:117] "RemoveContainer" containerID="c7bfe50a5bece5a853c839e17c62830ca96a1e0b6c78f9ccb3cd22ae7b470397" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.331931 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.379884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerStarted","Data":"4a91a7722ebeaf7cb5bc44c246f5749005fabdd8e6f8416ce3be838105726ef5"} Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.379922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerStarted","Data":"b860af748501e19490a50014dec379a7ea76b3265261574b5c49747643480f42"} Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.379931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerStarted","Data":"8f06a677c2da21482bd507aa6a6b708e1fe6a12c48971d6abc28454a64438a05"} Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.401325 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.428956 4749 scope.go:117] "RemoveContainer" containerID="2faff4371d10d3fe6bb565ff39f306b9c697e0b74686119bf87e91ca4a42c13d" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.433582 4749 generic.go:334] "Generic (PLEG): container finished" podID="763173db-176a-426d-bd85-e051d56ec5cf" containerID="325be88592db27be126170ad29b2fe3647f1a72fbdf514c1993695f5fc3d8121" exitCode=0 Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.433646 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerDied","Data":"325be88592db27be126170ad29b2fe3647f1a72fbdf514c1993695f5fc3d8121"} Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.447665 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.503223 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:53:28 crc kubenswrapper[4749]: E0219 18:53:28.503616 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-log" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.503633 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-log" Feb 19 18:53:28 crc kubenswrapper[4749]: E0219 18:53:28.503653 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-httpd" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.503660 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-httpd" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.503827 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-httpd" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.503840 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" containerName="glance-log" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.511472 4749 scope.go:117] "RemoveContainer" containerID="bb744c3795b7bbfd4b38ce2f51e278731316bb1a81d475e683fa156086c9ffdd" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.511977 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.514279 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.517055 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.517523 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-scripts\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40edc19-78bc-456e-ad9f-c3dcae644950-logs\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b40edc19-78bc-456e-ad9f-c3dcae644950-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-config-data\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654751 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqr4t\" (UniqueName: \"kubernetes.io/projected/b40edc19-78bc-456e-ad9f-c3dcae644950-kube-api-access-mqr4t\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.654937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.703905 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7e26f3eb-2933-4e84-9be8-2529ff29fc93" path="/var/lib/kubelet/pods/7e26f3eb-2933-4e84-9be8-2529ff29fc93/volumes" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqr4t\" (UniqueName: \"kubernetes.io/projected/b40edc19-78bc-456e-ad9f-c3dcae644950-kube-api-access-mqr4t\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40edc19-78bc-456e-ad9f-c3dcae644950-logs\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b40edc19-78bc-456e-ad9f-c3dcae644950-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.762854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-config-data\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.763494 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.764397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40edc19-78bc-456e-ad9f-c3dcae644950-logs\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 
18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.764777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b40edc19-78bc-456e-ad9f-c3dcae644950-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.776374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.776924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-config-data\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.781353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqr4t\" (UniqueName: \"kubernetes.io/projected/b40edc19-78bc-456e-ad9f-c3dcae644950-kube-api-access-mqr4t\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.782822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-scripts\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.790276 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b40edc19-78bc-456e-ad9f-c3dcae644950-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.811493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b40edc19-78bc-456e-ad9f-c3dcae644950\") " pod="openstack/glance-default-external-api-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.872265 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.971402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck9ss\" (UniqueName: \"kubernetes.io/projected/763173db-176a-426d-bd85-e051d56ec5cf-kube-api-access-ck9ss\") pod \"763173db-176a-426d-bd85-e051d56ec5cf\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.971575 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-combined-ca-bundle\") pod \"763173db-176a-426d-bd85-e051d56ec5cf\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.971613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-config-data\") pod \"763173db-176a-426d-bd85-e051d56ec5cf\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.971645 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763173db-176a-426d-bd85-e051d56ec5cf-logs\") pod \"763173db-176a-426d-bd85-e051d56ec5cf\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.971672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-custom-prometheus-ca\") pod \"763173db-176a-426d-bd85-e051d56ec5cf\" (UID: \"763173db-176a-426d-bd85-e051d56ec5cf\") " Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.978144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763173db-176a-426d-bd85-e051d56ec5cf-logs" (OuterVolumeSpecName: "logs") pod "763173db-176a-426d-bd85-e051d56ec5cf" (UID: "763173db-176a-426d-bd85-e051d56ec5cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:28 crc kubenswrapper[4749]: I0219 18:53:28.995256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763173db-176a-426d-bd85-e051d56ec5cf-kube-api-access-ck9ss" (OuterVolumeSpecName: "kube-api-access-ck9ss") pod "763173db-176a-426d-bd85-e051d56ec5cf" (UID: "763173db-176a-426d-bd85-e051d56ec5cf"). InnerVolumeSpecName "kube-api-access-ck9ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.018166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "763173db-176a-426d-bd85-e051d56ec5cf" (UID: "763173db-176a-426d-bd85-e051d56ec5cf"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.026438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "763173db-176a-426d-bd85-e051d56ec5cf" (UID: "763173db-176a-426d-bd85-e051d56ec5cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.043896 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.074224 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763173db-176a-426d-bd85-e051d56ec5cf-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.074259 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.074269 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck9ss\" (UniqueName: \"kubernetes.io/projected/763173db-176a-426d-bd85-e051d56ec5cf-kube-api-access-ck9ss\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.074278 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.093790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-config-data" (OuterVolumeSpecName: 
"config-data") pod "763173db-176a-426d-bd85-e051d56ec5cf" (UID: "763173db-176a-426d-bd85-e051d56ec5cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.175546 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763173db-176a-426d-bd85-e051d56ec5cf-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.449552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerStarted","Data":"3d826c7e7495884091cab7bfb336ab0f5a1b33bf1ab2075b4dc7fc62238156f4"} Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.451505 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.454053 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"763173db-176a-426d-bd85-e051d56ec5cf","Type":"ContainerDied","Data":"7658ea7fdcb8badbdbe7a597858743b9fe69f892e9dcebce8e21bea2fab84bce"} Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.454102 4749 scope.go:117] "RemoveContainer" containerID="325be88592db27be126170ad29b2fe3647f1a72fbdf514c1993695f5fc3d8121" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.482135 4749 generic.go:334] "Generic (PLEG): container finished" podID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerID="407cee09108dc1abb8c7926b286e7267ac6421007336a636481ae314b11fc6cb" exitCode=0 Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.482157 4749 generic.go:334] "Generic (PLEG): container finished" podID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerID="0b2d83fedc147ae1ae77fd48b66a049fc9f67e5140ec6175f6d725c0bd11c5c4" exitCode=143 Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 
18:53:29.482177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5d0300f8-4f96-4a80-ab8c-ebfec24dce10","Type":"ContainerDied","Data":"407cee09108dc1abb8c7926b286e7267ac6421007336a636481ae314b11fc6cb"} Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.482204 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5d0300f8-4f96-4a80-ab8c-ebfec24dce10","Type":"ContainerDied","Data":"0b2d83fedc147ae1ae77fd48b66a049fc9f67e5140ec6175f6d725c0bd11c5c4"} Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.504825 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.769122 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.792426 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.806515 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:29 crc kubenswrapper[4749]: E0219 18:53:29.807442 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.807461 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: E0219 18:53:29.807494 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.807502 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="763173db-176a-426d-bd85-e051d56ec5cf" 
containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: E0219 18:53:29.807518 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.807528 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.807739 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.807755 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.807770 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.807784 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.809958 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.813866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.814442 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.831462 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:53:29 crc kubenswrapper[4749]: E0219 18:53:29.887352 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2 is running failed: container process not found" containerID="71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.898367 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-kube-api-access-kdbzq\") pod \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.898510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-custom-prometheus-ca\") pod \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.898561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-combined-ca-bundle\") pod \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.898606 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-config-data\") pod \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 
18:53:29.898635 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-internal-tls-certs\") pod \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.898754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-public-tls-certs\") pod \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.898799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-logs\") pod \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\" (UID: \"5d0300f8-4f96-4a80-ab8c-ebfec24dce10\") " Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.899193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.899310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.899413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-logs\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.899467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwk8b\" (UniqueName: \"kubernetes.io/projected/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-kube-api-access-nwk8b\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.899492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.906091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-kube-api-access-kdbzq" (OuterVolumeSpecName: "kube-api-access-kdbzq") pod "5d0300f8-4f96-4a80-ab8c-ebfec24dce10" (UID: "5d0300f8-4f96-4a80-ab8c-ebfec24dce10"). InnerVolumeSpecName "kube-api-access-kdbzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.906358 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-logs" (OuterVolumeSpecName: "logs") pod "5d0300f8-4f96-4a80-ab8c-ebfec24dce10" (UID: "5d0300f8-4f96-4a80-ab8c-ebfec24dce10"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: E0219 18:53:29.915541 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2 is running failed: container process not found" containerID="71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:53:29 crc kubenswrapper[4749]: E0219 18:53:29.916870 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2 is running failed: container process not found" containerID="71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 18:53:29 crc kubenswrapper[4749]: E0219 18:53:29.916941 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2124f654-902e-4591-9c6f-e98e919dc8ca" containerName="watcher-applier" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.959964 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d0300f8-4f96-4a80-ab8c-ebfec24dce10" (UID: "5d0300f8-4f96-4a80-ab8c-ebfec24dce10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.963058 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.983361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-config-data" (OuterVolumeSpecName: "config-data") pod "5d0300f8-4f96-4a80-ab8c-ebfec24dce10" (UID: "5d0300f8-4f96-4a80-ab8c-ebfec24dce10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.985116 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d0300f8-4f96-4a80-ab8c-ebfec24dce10" (UID: "5d0300f8-4f96-4a80-ab8c-ebfec24dce10"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.987528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "5d0300f8-4f96-4a80-ab8c-ebfec24dce10" (UID: "5d0300f8-4f96-4a80-ab8c-ebfec24dce10"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:29 crc kubenswrapper[4749]: I0219 18:53:29.998810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d0300f8-4f96-4a80-ab8c-ebfec24dce10" (UID: "5d0300f8-4f96-4a80-ab8c-ebfec24dce10"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.000762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-logs\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.000809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwk8b\" (UniqueName: \"kubernetes.io/projected/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-kube-api-access-nwk8b\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.000829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.000898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.000949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc 
kubenswrapper[4749]: I0219 18:53:30.001007 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.001026 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.001353 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.001432 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.001495 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-kube-api-access-kdbzq\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.001551 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.001608 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0300f8-4f96-4a80-ab8c-ebfec24dce10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.002900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-logs\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.009343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.010918 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.012427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.031096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwk8b\" (UniqueName: \"kubernetes.io/projected/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-kube-api-access-nwk8b\") pod \"watcher-decision-engine-0\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.102847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2124f654-902e-4591-9c6f-e98e919dc8ca-logs\") pod 
\"2124f654-902e-4591-9c6f-e98e919dc8ca\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.102903 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-combined-ca-bundle\") pod \"2124f654-902e-4591-9c6f-e98e919dc8ca\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.102958 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwt25\" (UniqueName: \"kubernetes.io/projected/2124f654-902e-4591-9c6f-e98e919dc8ca-kube-api-access-hwt25\") pod \"2124f654-902e-4591-9c6f-e98e919dc8ca\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.102987 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-config-data\") pod \"2124f654-902e-4591-9c6f-e98e919dc8ca\" (UID: \"2124f654-902e-4591-9c6f-e98e919dc8ca\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.103991 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2124f654-902e-4591-9c6f-e98e919dc8ca-logs" (OuterVolumeSpecName: "logs") pod "2124f654-902e-4591-9c6f-e98e919dc8ca" (UID: "2124f654-902e-4591-9c6f-e98e919dc8ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.129377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2124f654-902e-4591-9c6f-e98e919dc8ca-kube-api-access-hwt25" (OuterVolumeSpecName: "kube-api-access-hwt25") pod "2124f654-902e-4591-9c6f-e98e919dc8ca" (UID: "2124f654-902e-4591-9c6f-e98e919dc8ca"). InnerVolumeSpecName "kube-api-access-hwt25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.145769 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.154698 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2124f654-902e-4591-9c6f-e98e919dc8ca" (UID: "2124f654-902e-4591-9c6f-e98e919dc8ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.192477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-config-data" (OuterVolumeSpecName: "config-data") pod "2124f654-902e-4591-9c6f-e98e919dc8ca" (UID: "2124f654-902e-4591-9c6f-e98e919dc8ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.206574 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2124f654-902e-4591-9c6f-e98e919dc8ca-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.206604 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.206614 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwt25\" (UniqueName: \"kubernetes.io/projected/2124f654-902e-4591-9c6f-e98e919dc8ca-kube-api-access-hwt25\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.206622 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2124f654-902e-4591-9c6f-e98e919dc8ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.232516 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-httpd-run\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308571 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-combined-ca-bundle\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308601 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk52t\" (UniqueName: \"kubernetes.io/projected/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-kube-api-access-wk52t\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-logs\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-internal-tls-certs\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-scripts\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.308834 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-config-data\") pod \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\" (UID: \"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e\") " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.312543 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-logs" (OuterVolumeSpecName: "logs") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.312695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.329671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-kube-api-access-wk52t" (OuterVolumeSpecName: "kube-api-access-wk52t") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). InnerVolumeSpecName "kube-api-access-wk52t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.338004 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.349717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-scripts" (OuterVolumeSpecName: "scripts") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.401361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.410446 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.410470 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.410479 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk52t\" (UniqueName: \"kubernetes.io/projected/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-kube-api-access-wk52t\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.410490 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.410513 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.410522 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.415268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.433166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-config-data" (OuterVolumeSpecName: "config-data") pod "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" (UID: "39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.481478 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.507300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5d0300f8-4f96-4a80-ab8c-ebfec24dce10","Type":"ContainerDied","Data":"dc6872dc284041b62c24ae595013b41da166ccf5b9ebfe7df81d5208d24962f9"} Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.507380 4749 scope.go:117] "RemoveContainer" containerID="407cee09108dc1abb8c7926b286e7267ac6421007336a636481ae314b11fc6cb" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.510239 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.512984 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.514279 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.514316 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.536165 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.546539 4749 generic.go:334] "Generic (PLEG): container finished" podID="2124f654-902e-4591-9c6f-e98e919dc8ca" containerID="71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" exitCode=0 Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.546622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2124f654-902e-4591-9c6f-e98e919dc8ca","Type":"ContainerDied","Data":"71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2"} Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.546657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2124f654-902e-4591-9c6f-e98e919dc8ca","Type":"ContainerDied","Data":"510a5190113cb3b81aad6b5e0d50068f3a5e2da5ccc962f4dc327094fb96ef02"} Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.547191 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.584613 4749 scope.go:117] "RemoveContainer" containerID="0b2d83fedc147ae1ae77fd48b66a049fc9f67e5140ec6175f6d725c0bd11c5c4" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.586452 4749 generic.go:334] "Generic (PLEG): container finished" podID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerID="168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341" exitCode=0 Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.586635 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.587167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e","Type":"ContainerDied","Data":"168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341"} Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.587204 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e","Type":"ContainerDied","Data":"10a7a972a3ca4987065ef5cc5e43b26ec415bd4394cb473967e9e3102dbe2671"} Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.629238 4749 scope.go:117] "RemoveContainer" containerID="71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.634243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b40edc19-78bc-456e-ad9f-c3dcae644950","Type":"ContainerStarted","Data":"a197a0ed0ab7924ec896076b01144580a0810a086c4618e2e68fd0e20b4eb7c0"} Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.670902 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: 
I0219 18:53:30.692005 4749 scope.go:117] "RemoveContainer" containerID="71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.697553 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2\": container with ID starting with 71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2 not found: ID does not exist" containerID="71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.697600 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2"} err="failed to get container status \"71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2\": rpc error: code = NotFound desc = could not find container \"71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2\": container with ID starting with 71ee3854394215c8e85f9b61fe43bf76098e7269572bd5d5696006317382cbd2 not found: ID does not exist" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.697629 4749 scope.go:117] "RemoveContainer" containerID="168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.744486 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763173db-176a-426d-bd85-e051d56ec5cf" path="/var/lib/kubelet/pods/763173db-176a-426d-bd85-e051d56ec5cf/volumes" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.745502 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.778140 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.791093 4749 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.801257 4749 scope.go:117] "RemoveContainer" containerID="430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811188 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.811588 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-log" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811604 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-log" Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.811622 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2124f654-902e-4591-9c6f-e98e919dc8ca" containerName="watcher-applier" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811628 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2124f654-902e-4591-9c6f-e98e919dc8ca" containerName="watcher-applier" Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.811648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811654 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api" Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.811666 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api-log" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811672 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api-log" Feb 19 18:53:30 crc 
kubenswrapper[4749]: E0219 18:53:30.811685 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811691 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="763173db-176a-426d-bd85-e051d56ec5cf" containerName="watcher-decision-engine" Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.811703 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-httpd" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811710 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-httpd" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811873 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2124f654-902e-4591-9c6f-e98e919dc8ca" containerName="watcher-applier" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811885 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811897 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-log" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811906 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" containerName="glance-httpd" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.811919 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api-log" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.812614 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.815544 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.827975 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.850198 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.892027 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.907597 4749 scope.go:117] "RemoveContainer" containerID="168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341" Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.915374 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341\": container with ID starting with 168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341 not found: ID does not exist" containerID="168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.915413 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341"} err="failed to get container status \"168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341\": rpc error: code = NotFound desc = could not find container \"168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341\": container with ID starting with 168ac2ff6309d655007aef520878e87ef5536f7fcfee61e78aca70aeed5b6341 not found: ID does not exist" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 
18:53:30.915438 4749 scope.go:117] "RemoveContainer" containerID="430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.918116 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.919630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:53:30 crc kubenswrapper[4749]: E0219 18:53:30.920248 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f\": container with ID starting with 430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f not found: ID does not exist" containerID="430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.920275 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f"} err="failed to get container status \"430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f\": rpc error: code = NotFound desc = could not find container \"430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f\": container with ID starting with 430b1de81387aa810d8cf01a79bc304a5124d103d6f552e5dcf0d40f9cf5a79f not found: ID does not exist" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.922064 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.922311 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.926409 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-watcher-public-svc" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.933989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97a54ef-7ca6-4ad1-951b-5c05572b591a-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.934062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fxmm\" (UniqueName: \"kubernetes.io/projected/f97a54ef-7ca6-4ad1-951b-5c05572b591a-kube-api-access-2fxmm\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.934192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f97a54ef-7ca6-4ad1-951b-5c05572b591a-logs\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.934214 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f97a54ef-7ca6-4ad1-951b-5c05572b591a-config-data\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.935389 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.965544 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.967048 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.972956 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.973515 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 18:53:30 crc kubenswrapper[4749]: I0219 18:53:30.992118 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.016066 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.035854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f97a54ef-7ca6-4ad1-951b-5c05572b591a-config-data\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.035895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.035926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97a54ef-7ca6-4ad1-951b-5c05572b591a-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.035950 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fzf\" (UniqueName: \"kubernetes.io/projected/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-kube-api-access-h2fzf\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.035978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fxmm\" (UniqueName: \"kubernetes.io/projected/f97a54ef-7ca6-4ad1-951b-5c05572b591a-kube-api-access-2fxmm\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036101 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-config-data\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d9d8ca6-8ca8-415e-9120-5c48d275052c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036241 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9d8ca6-8ca8-415e-9120-5c48d275052c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6fg9\" (UniqueName: \"kubernetes.io/projected/4d9d8ca6-8ca8-415e-9120-5c48d275052c-kube-api-access-j6fg9\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-logs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f97a54ef-7ca6-4ad1-951b-5c05572b591a-logs\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.036645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f97a54ef-7ca6-4ad1-951b-5c05572b591a-logs\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.041283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f97a54ef-7ca6-4ad1-951b-5c05572b591a-config-data\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.042621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97a54ef-7ca6-4ad1-951b-5c05572b591a-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.058613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fxmm\" (UniqueName: \"kubernetes.io/projected/f97a54ef-7ca6-4ad1-951b-5c05572b591a-kube-api-access-2fxmm\") pod \"watcher-applier-0\" (UID: \"f97a54ef-7ca6-4ad1-951b-5c05572b591a\") " pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.138242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc 
kubenswrapper[4749]: I0219 18:53:31.138575 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fzf\" (UniqueName: \"kubernetes.io/projected/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-kube-api-access-h2fzf\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.138626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.138661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-config-data\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d9d8ca6-8ca8-415e-9120-5c48d275052c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139367 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9d8ca6-8ca8-415e-9120-5c48d275052c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6fg9\" (UniqueName: \"kubernetes.io/projected/4d9d8ca6-8ca8-415e-9120-5c48d275052c-kube-api-access-j6fg9\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-logs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.139808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-logs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc 
kubenswrapper[4749]: I0219 18:53:31.143102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.143631 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.143772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9d8ca6-8ca8-415e-9120-5c48d275052c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.145172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.145263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d9d8ca6-8ca8-415e-9120-5c48d275052c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.153543 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fzf\" 
(UniqueName: \"kubernetes.io/projected/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-kube-api-access-h2fzf\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.156310 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.156641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.157207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-config-data\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.160241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.160352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6\") " 
pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.162703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d9d8ca6-8ca8-415e-9120-5c48d275052c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.164684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6fg9\" (UniqueName: \"kubernetes.io/projected/4d9d8ca6-8ca8-415e-9120-5c48d275052c-kube-api-access-j6fg9\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.195652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4d9d8ca6-8ca8-415e-9120-5c48d275052c\") " pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.286315 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.318375 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.335975 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.661480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerStarted","Data":"96d12a07a67dbe4dc0d82d1eb9ea6cfa2cd3f3e1b225475b6aac87b7fdcb0742"} Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.661634 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-central-agent" containerID="cri-o://b860af748501e19490a50014dec379a7ea76b3265261574b5c49747643480f42" gracePeriod=30 Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.661902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.661916 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="proxy-httpd" containerID="cri-o://96d12a07a67dbe4dc0d82d1eb9ea6cfa2cd3f3e1b225475b6aac87b7fdcb0742" gracePeriod=30 Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.662056 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="sg-core" containerID="cri-o://3d826c7e7495884091cab7bfb336ab0f5a1b33bf1ab2075b4dc7fc62238156f4" gracePeriod=30 Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.662116 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-notification-agent" containerID="cri-o://4a91a7722ebeaf7cb5bc44c246f5749005fabdd8e6f8416ce3be838105726ef5" gracePeriod=30 Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.672019 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"90ac14db-0b4c-4638-882d-1ab8c9cde6e5","Type":"ContainerStarted","Data":"86980e71adcd70f4ca16bc5945aa7844f479527ad5ab60c0a8ec9ea9440a83a2"} Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.672113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"90ac14db-0b4c-4638-882d-1ab8c9cde6e5","Type":"ContainerStarted","Data":"10f9ce2ac5963bbf9a71fec1f185ba3242ac14993182426de7e675a29fa362fa"} Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.681908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b40edc19-78bc-456e-ad9f-c3dcae644950","Type":"ContainerStarted","Data":"59a6c5ef2b4dc03375a4be390276eb02ac6ea63641a9ffb4bcc2fe389bf4759b"} Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.696147 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.898500613 podStartE2EDuration="5.696129203s" podCreationTimestamp="2026-02-19 18:53:26 +0000 UTC" firstStartedPulling="2026-02-19 18:53:27.304170905 +0000 UTC m=+1181.265390859" lastFinishedPulling="2026-02-19 18:53:31.101799495 +0000 UTC m=+1185.063019449" observedRunningTime="2026-02-19 18:53:31.695417697 +0000 UTC m=+1185.656637661" watchObservedRunningTime="2026-02-19 18:53:31.696129203 +0000 UTC m=+1185.657349147" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.737578 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.737552966 podStartE2EDuration="2.737552966s" podCreationTimestamp="2026-02-19 18:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:31.720465603 +0000 UTC m=+1185.681685557" 
watchObservedRunningTime="2026-02-19 18:53:31.737552966 +0000 UTC m=+1185.698772920" Feb 19 18:53:31 crc kubenswrapper[4749]: I0219 18:53:31.859661 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.091396 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.195810 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.696316 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2124f654-902e-4591-9c6f-e98e919dc8ca" path="/var/lib/kubelet/pods/2124f654-902e-4591-9c6f-e98e919dc8ca/volumes" Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.698876 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e" path="/var/lib/kubelet/pods/39fd4de9-8c55-4ea6-aa69-b7ee9b3e8d7e/volumes" Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.699906 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" path="/var/lib/kubelet/pods/5d0300f8-4f96-4a80-ab8c-ebfec24dce10/volumes" Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.701204 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d9d8ca6-8ca8-415e-9120-5c48d275052c","Type":"ContainerStarted","Data":"726d0712243dcb7134cf3adb3af62223fa983e7e7af90bc3b81b247d79cccbab"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.705929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6","Type":"ContainerStarted","Data":"54c5d646aabc9135d63ad2c4c8158587a98c3ff0740295115b37e48fba78ed84"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.705970 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6","Type":"ContainerStarted","Data":"9e43f9257dba8dafe367ed85d7756ecbbfa071db672134d868152abb2ec8cf18"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.705991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6","Type":"ContainerStarted","Data":"b7c98dd566e232d6cab82312bd2b59dff7ab4e8863a894ab1d2a5aaa24c670ed"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.706542 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.710273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b40edc19-78bc-456e-ad9f-c3dcae644950","Type":"ContainerStarted","Data":"c6c115f0f602fdff1fbfa90fb53a30bce23d6de8d1428efb21fbbc4c2e8e4c0d"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.712910 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerID="3d826c7e7495884091cab7bfb336ab0f5a1b33bf1ab2075b4dc7fc62238156f4" exitCode=2 Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.712935 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerID="4a91a7722ebeaf7cb5bc44c246f5749005fabdd8e6f8416ce3be838105726ef5" exitCode=0 Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.712963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerDied","Data":"3d826c7e7495884091cab7bfb336ab0f5a1b33bf1ab2075b4dc7fc62238156f4"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.712982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerDied","Data":"4a91a7722ebeaf7cb5bc44c246f5749005fabdd8e6f8416ce3be838105726ef5"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.715267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"f97a54ef-7ca6-4ad1-951b-5c05572b591a","Type":"ContainerStarted","Data":"0eb0a2694bc45dd0b28606a6874a9639d38328fb3107cc00b34521c9ccc4b47c"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.715292 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"f97a54ef-7ca6-4ad1-951b-5c05572b591a","Type":"ContainerStarted","Data":"6ec3ed47e799c75583b13c0ffc149fab5ff03e3546684cc1566ee8f48aa9de2c"} Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.715449 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.198:9322/\": dial tcp 10.217.0.198:9322: connect: connection refused" Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.740368 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.740345064 podStartE2EDuration="2.740345064s" podCreationTimestamp="2026-02-19 18:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:32.733456707 +0000 UTC m=+1186.694676661" watchObservedRunningTime="2026-02-19 18:53:32.740345064 +0000 UTC m=+1186.701565028" Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.766821 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.766798814 podStartE2EDuration="4.766798814s" podCreationTimestamp="2026-02-19 18:53:28 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:32.762996322 +0000 UTC m=+1186.724216276" watchObservedRunningTime="2026-02-19 18:53:32.766798814 +0000 UTC m=+1186.728018768" Feb 19 18:53:32 crc kubenswrapper[4749]: I0219 18:53:32.790406 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.790390565 podStartE2EDuration="2.790390565s" podCreationTimestamp="2026-02-19 18:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:32.777745279 +0000 UTC m=+1186.738965223" watchObservedRunningTime="2026-02-19 18:53:32.790390565 +0000 UTC m=+1186.751610509" Feb 19 18:53:33 crc kubenswrapper[4749]: I0219 18:53:33.727810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d9d8ca6-8ca8-415e-9120-5c48d275052c","Type":"ContainerStarted","Data":"f9137868519c828c9ffd15844887d8e79c73e0234bc4f2e9e606528b074668b3"} Feb 19 18:53:33 crc kubenswrapper[4749]: I0219 18:53:33.728426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d9d8ca6-8ca8-415e-9120-5c48d275052c","Type":"ContainerStarted","Data":"5170f7567981e9f4c728b29b2f13baeed9bb92a9328e8e8c6cd9a7c106694faf"} Feb 19 18:53:34 crc kubenswrapper[4749]: I0219 18:53:34.694645 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.179:9322/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 18:53:34 crc kubenswrapper[4749]: I0219 18:53:34.694683 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" 
podUID="5d0300f8-4f96-4a80-ab8c-ebfec24dce10" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.179:9322/\": dial tcp 10.217.0.179:9322: i/o timeout" Feb 19 18:53:36 crc kubenswrapper[4749]: I0219 18:53:36.063711 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 18:53:36 crc kubenswrapper[4749]: I0219 18:53:36.085559 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.08553993 podStartE2EDuration="6.08553993s" podCreationTimestamp="2026-02-19 18:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:33.750491679 +0000 UTC m=+1187.711711633" watchObservedRunningTime="2026-02-19 18:53:36.08553993 +0000 UTC m=+1190.046759884" Feb 19 18:53:36 crc kubenswrapper[4749]: I0219 18:53:36.287179 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 19 18:53:36 crc kubenswrapper[4749]: I0219 18:53:36.319530 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.551626 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xzg8l"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.553074 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.581079 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xzg8l"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.588555 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-53d3-account-create-update-7n9p9"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.589657 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.591291 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.603744 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-53d3-account-create-update-7n9p9"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.658990 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dcm8j"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.660156 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.672318 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dcm8j"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.681999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe2b2973-1897-4202-be29-9ca7805d443a-operator-scripts\") pod \"nova-api-db-create-xzg8l\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") " pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.682204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktmxx\" (UniqueName: \"kubernetes.io/projected/fe2b2973-1897-4202-be29-9ca7805d443a-kube-api-access-ktmxx\") pod \"nova-api-db-create-xzg8l\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") " pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.682233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde9adc3-a038-4de5-949c-d366ebe287d6-operator-scripts\") pod \"nova-api-53d3-account-create-update-7n9p9\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") " pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.682320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xjc\" (UniqueName: \"kubernetes.io/projected/fde9adc3-a038-4de5-949c-d366ebe287d6-kube-api-access-h4xjc\") pod \"nova-api-53d3-account-create-update-7n9p9\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") " pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.723438 
4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4085-account-create-update-t9pdt"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.724639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.727302 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.736642 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9r5lq"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.737809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.765889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9r5lq"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.782853 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4085-account-create-update-t9pdt"] Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-operator-scripts\") pod \"nova-cell0-db-create-dcm8j\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") " pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktmxx\" (UniqueName: \"kubernetes.io/projected/fe2b2973-1897-4202-be29-9ca7805d443a-kube-api-access-ktmxx\") pod \"nova-api-db-create-xzg8l\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") " pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc 
kubenswrapper[4749]: I0219 18:53:37.784656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde9adc3-a038-4de5-949c-d366ebe287d6-operator-scripts\") pod \"nova-api-53d3-account-create-update-7n9p9\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") " pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb7b549-3520-49f2-9ee8-089bab48181d-operator-scripts\") pod \"nova-cell0-4085-account-create-update-t9pdt\" (UID: \"0cb7b549-3520-49f2-9ee8-089bab48181d\") " pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6d5n\" (UniqueName: \"kubernetes.io/projected/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-kube-api-access-s6d5n\") pod \"nova-cell0-db-create-dcm8j\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") " pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xjc\" (UniqueName: \"kubernetes.io/projected/fde9adc3-a038-4de5-949c-d366ebe287d6-kube-api-access-h4xjc\") pod \"nova-api-53d3-account-create-update-7n9p9\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") " pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784849 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw2ss\" (UniqueName: \"kubernetes.io/projected/0cb7b549-3520-49f2-9ee8-089bab48181d-kube-api-access-tw2ss\") pod 
\"nova-cell0-4085-account-create-update-t9pdt\" (UID: \"0cb7b549-3520-49f2-9ee8-089bab48181d\") " pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe2b2973-1897-4202-be29-9ca7805d443a-operator-scripts\") pod \"nova-api-db-create-xzg8l\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") " pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8vdk\" (UniqueName: \"kubernetes.io/projected/b169885b-4f49-41e1-83e7-d43237329082-kube-api-access-f8vdk\") pod \"nova-cell1-db-create-9r5lq\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") " pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.784975 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b169885b-4f49-41e1-83e7-d43237329082-operator-scripts\") pod \"nova-cell1-db-create-9r5lq\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") " pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.786212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde9adc3-a038-4de5-949c-d366ebe287d6-operator-scripts\") pod \"nova-api-53d3-account-create-update-7n9p9\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") " pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.786591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fe2b2973-1897-4202-be29-9ca7805d443a-operator-scripts\") pod \"nova-api-db-create-xzg8l\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") " pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.799342 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerID="b860af748501e19490a50014dec379a7ea76b3265261574b5c49747643480f42" exitCode=0 Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.799384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerDied","Data":"b860af748501e19490a50014dec379a7ea76b3265261574b5c49747643480f42"} Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.807587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xjc\" (UniqueName: \"kubernetes.io/projected/fde9adc3-a038-4de5-949c-d366ebe287d6-kube-api-access-h4xjc\") pod \"nova-api-53d3-account-create-update-7n9p9\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") " pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.811127 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktmxx\" (UniqueName: \"kubernetes.io/projected/fe2b2973-1897-4202-be29-9ca7805d443a-kube-api-access-ktmxx\") pod \"nova-api-db-create-xzg8l\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") " pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.869487 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xzg8l" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.886731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b169885b-4f49-41e1-83e7-d43237329082-operator-scripts\") pod \"nova-cell1-db-create-9r5lq\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") " pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.886813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-operator-scripts\") pod \"nova-cell0-db-create-dcm8j\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") " pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.886856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb7b549-3520-49f2-9ee8-089bab48181d-operator-scripts\") pod \"nova-cell0-4085-account-create-update-t9pdt\" (UID: \"0cb7b549-3520-49f2-9ee8-089bab48181d\") " pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.886904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6d5n\" (UniqueName: \"kubernetes.io/projected/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-kube-api-access-s6d5n\") pod \"nova-cell0-db-create-dcm8j\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") " pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.887185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw2ss\" (UniqueName: \"kubernetes.io/projected/0cb7b549-3520-49f2-9ee8-089bab48181d-kube-api-access-tw2ss\") pod \"nova-cell0-4085-account-create-update-t9pdt\" (UID: 
\"0cb7b549-3520-49f2-9ee8-089bab48181d\") " pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.887247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8vdk\" (UniqueName: \"kubernetes.io/projected/b169885b-4f49-41e1-83e7-d43237329082-kube-api-access-f8vdk\") pod \"nova-cell1-db-create-9r5lq\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") " pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.887831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb7b549-3520-49f2-9ee8-089bab48181d-operator-scripts\") pod \"nova-cell0-4085-account-create-update-t9pdt\" (UID: \"0cb7b549-3520-49f2-9ee8-089bab48181d\") " pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.887864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-operator-scripts\") pod \"nova-cell0-db-create-dcm8j\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") " pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.888358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b169885b-4f49-41e1-83e7-d43237329082-operator-scripts\") pod \"nova-cell1-db-create-9r5lq\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") " pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.904999 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-53d3-account-create-update-7n9p9" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.907831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw2ss\" (UniqueName: \"kubernetes.io/projected/0cb7b549-3520-49f2-9ee8-089bab48181d-kube-api-access-tw2ss\") pod \"nova-cell0-4085-account-create-update-t9pdt\" (UID: \"0cb7b549-3520-49f2-9ee8-089bab48181d\") " pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.911297 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6d5n\" (UniqueName: \"kubernetes.io/projected/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-kube-api-access-s6d5n\") pod \"nova-cell0-db-create-dcm8j\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") " pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.915697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8vdk\" (UniqueName: \"kubernetes.io/projected/b169885b-4f49-41e1-83e7-d43237329082-kube-api-access-f8vdk\") pod \"nova-cell1-db-create-9r5lq\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") " pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:37 crc kubenswrapper[4749]: I0219 18:53:37.976401 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dcm8j" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:37.992018 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f0bc-account-create-update-xv578"] Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:37.993481 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.001329 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.046878 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4085-account-create-update-t9pdt" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.066626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f0bc-account-create-update-xv578"] Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.069676 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9r5lq" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.106925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6fvk\" (UniqueName: \"kubernetes.io/projected/01a9c221-8937-4524-98ed-508ca1522a9d-kube-api-access-c6fvk\") pod \"nova-cell1-f0bc-account-create-update-xv578\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") " pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.107097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a9c221-8937-4524-98ed-508ca1522a9d-operator-scripts\") pod \"nova-cell1-f0bc-account-create-update-xv578\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") " pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.215053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a9c221-8937-4524-98ed-508ca1522a9d-operator-scripts\") pod 
\"nova-cell1-f0bc-account-create-update-xv578\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") " pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.215223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6fvk\" (UniqueName: \"kubernetes.io/projected/01a9c221-8937-4524-98ed-508ca1522a9d-kube-api-access-c6fvk\") pod \"nova-cell1-f0bc-account-create-update-xv578\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") " pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.215823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a9c221-8937-4524-98ed-508ca1522a9d-operator-scripts\") pod \"nova-cell1-f0bc-account-create-update-xv578\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") " pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.271481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6fvk\" (UniqueName: \"kubernetes.io/projected/01a9c221-8937-4524-98ed-508ca1522a9d-kube-api-access-c6fvk\") pod \"nova-cell1-f0bc-account-create-update-xv578\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") " pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.504779 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f0bc-account-create-update-xv578" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.589569 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xzg8l"] Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.814409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzg8l" event={"ID":"fe2b2973-1897-4202-be29-9ca7805d443a","Type":"ContainerStarted","Data":"d1ae116371615005842cecd26fb4feec90c7ad69b97d39d522d3bc68dcb6dc25"} Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.814713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzg8l" event={"ID":"fe2b2973-1897-4202-be29-9ca7805d443a","Type":"ContainerStarted","Data":"4276fb0a6bd638bcd4fdd7f44d2b12085d6d72903c861efba0c600eabdae2226"} Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.834818 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-xzg8l" podStartSLOduration=1.834804411 podStartE2EDuration="1.834804411s" podCreationTimestamp="2026-02-19 18:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:38.83437959 +0000 UTC m=+1192.795599544" watchObservedRunningTime="2026-02-19 18:53:38.834804411 +0000 UTC m=+1192.796024355" Feb 19 18:53:38 crc kubenswrapper[4749]: I0219 18:53:38.999194 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9r5lq"] Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.013413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4085-account-create-update-t9pdt"] Feb 19 18:53:39 crc kubenswrapper[4749]: W0219 18:53:39.016757 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb169885b_4f49_41e1_83e7_d43237329082.slice/crio-b86fca2e73ef0f04f38b97f7083190266d9e4a1c2db451aa99dc926dd3bf3517 WatchSource:0}: Error finding container b86fca2e73ef0f04f38b97f7083190266d9e4a1c2db451aa99dc926dd3bf3517: Status 404 returned error can't find the container with id b86fca2e73ef0f04f38b97f7083190266d9e4a1c2db451aa99dc926dd3bf3517 Feb 19 18:53:39 crc kubenswrapper[4749]: W0219 18:53:39.017561 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cb7b549_3520_49f2_9ee8_089bab48181d.slice/crio-38b64f8108d7897feab7aeaff8e92ae3804e7c0ce595e35184e2b0a9449ee069 WatchSource:0}: Error finding container 38b64f8108d7897feab7aeaff8e92ae3804e7c0ce595e35184e2b0a9449ee069: Status 404 returned error can't find the container with id 38b64f8108d7897feab7aeaff8e92ae3804e7c0ce595e35184e2b0a9449ee069 Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.024752 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dcm8j"] Feb 19 18:53:39 crc kubenswrapper[4749]: W0219 18:53:39.028387 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfde9adc3_a038_4de5_949c_d366ebe287d6.slice/crio-75e49b58616d784e186760c0bcf91e4de08697c2263ff42d5e110e1f653dbf1c WatchSource:0}: Error finding container 75e49b58616d784e186760c0bcf91e4de08697c2263ff42d5e110e1f653dbf1c: Status 404 returned error can't find the container with id 75e49b58616d784e186760c0bcf91e4de08697c2263ff42d5e110e1f653dbf1c Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.032774 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-53d3-account-create-update-7n9p9"] Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.044682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.044801 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.083563 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.097250 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.098168 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f0bc-account-create-update-xv578"] Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.824427 4749 generic.go:334] "Generic (PLEG): container finished" podID="b169885b-4f49-41e1-83e7-d43237329082" containerID="2a241fe1cf6961b1ad10732e0e8d3fd8a6fc085e00fef88f87c5f9f0009ee6e9" exitCode=0 Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.824484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9r5lq" event={"ID":"b169885b-4f49-41e1-83e7-d43237329082","Type":"ContainerDied","Data":"2a241fe1cf6961b1ad10732e0e8d3fd8a6fc085e00fef88f87c5f9f0009ee6e9"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.824510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9r5lq" event={"ID":"b169885b-4f49-41e1-83e7-d43237329082","Type":"ContainerStarted","Data":"b86fca2e73ef0f04f38b97f7083190266d9e4a1c2db451aa99dc926dd3bf3517"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.827325 4749 generic.go:334] "Generic (PLEG): container finished" podID="ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f" containerID="63d4c8b4d8f9752dde1907d862f485e8035f1f3cf89a217ef2e652ea42cfb699" exitCode=0 Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.827366 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dcm8j" event={"ID":"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f","Type":"ContainerDied","Data":"63d4c8b4d8f9752dde1907d862f485e8035f1f3cf89a217ef2e652ea42cfb699"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.827381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dcm8j" event={"ID":"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f","Type":"ContainerStarted","Data":"34af642c914a8e1313a2048b1ff40a80c8b933aee30cd9865046920599e35e47"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.828873 4749 generic.go:334] "Generic (PLEG): container finished" podID="0cb7b549-3520-49f2-9ee8-089bab48181d" containerID="cfa6524ccba16d2d8f1f346ed55ba6bd6cc21a6a4e3abea529a647ce2e2307a3" exitCode=0 Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.828908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4085-account-create-update-t9pdt" event={"ID":"0cb7b549-3520-49f2-9ee8-089bab48181d","Type":"ContainerDied","Data":"cfa6524ccba16d2d8f1f346ed55ba6bd6cc21a6a4e3abea529a647ce2e2307a3"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.828922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4085-account-create-update-t9pdt" event={"ID":"0cb7b549-3520-49f2-9ee8-089bab48181d","Type":"ContainerStarted","Data":"38b64f8108d7897feab7aeaff8e92ae3804e7c0ce595e35184e2b0a9449ee069"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.830466 4749 generic.go:334] "Generic (PLEG): container finished" podID="fde9adc3-a038-4de5-949c-d366ebe287d6" containerID="65cd47193439036fe28115d1479c32c2eb3410dab98fa8a6c954cb4a3fada840" exitCode=0 Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.830500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-53d3-account-create-update-7n9p9" 
event={"ID":"fde9adc3-a038-4de5-949c-d366ebe287d6","Type":"ContainerDied","Data":"65cd47193439036fe28115d1479c32c2eb3410dab98fa8a6c954cb4a3fada840"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.830514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-53d3-account-create-update-7n9p9" event={"ID":"fde9adc3-a038-4de5-949c-d366ebe287d6","Type":"ContainerStarted","Data":"75e49b58616d784e186760c0bcf91e4de08697c2263ff42d5e110e1f653dbf1c"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.832560 4749 generic.go:334] "Generic (PLEG): container finished" podID="fe2b2973-1897-4202-be29-9ca7805d443a" containerID="d1ae116371615005842cecd26fb4feec90c7ad69b97d39d522d3bc68dcb6dc25" exitCode=0 Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.832626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzg8l" event={"ID":"fe2b2973-1897-4202-be29-9ca7805d443a","Type":"ContainerDied","Data":"d1ae116371615005842cecd26fb4feec90c7ad69b97d39d522d3bc68dcb6dc25"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.837272 4749 generic.go:334] "Generic (PLEG): container finished" podID="01a9c221-8937-4524-98ed-508ca1522a9d" containerID="44a837171e721b0ccd68a06da05986cc1fc3220ece3994219136528f6ac967cd" exitCode=0 Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.837727 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f0bc-account-create-update-xv578" event={"ID":"01a9c221-8937-4524-98ed-508ca1522a9d","Type":"ContainerDied","Data":"44a837171e721b0ccd68a06da05986cc1fc3220ece3994219136528f6ac967cd"} Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.837784 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.837797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f0bc-account-create-update-xv578" 
event={"ID":"01a9c221-8937-4524-98ed-508ca1522a9d","Type":"ContainerStarted","Data":"e9b6623d1759483ff12551da68ebbe3d40860671545132a395932a91be7c6cd0"}
Feb 19 18:53:39 crc kubenswrapper[4749]: I0219 18:53:39.839409 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 18:53:40 crc kubenswrapper[4749]: I0219 18:53:40.146362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 19 18:53:40 crc kubenswrapper[4749]: I0219 18:53:40.184880 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Feb 19 18:53:40 crc kubenswrapper[4749]: I0219 18:53:40.848133 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Feb 19 18:53:40 crc kubenswrapper[4749]: I0219 18:53:40.910632 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.287677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.318839 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.324271 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.324386 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dcm8j"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.336535 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.336578 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.358439 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.393351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-operator-scripts\") pod \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.393434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6d5n\" (UniqueName: \"kubernetes.io/projected/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-kube-api-access-s6d5n\") pod \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\" (UID: \"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.398910 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.412040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f" (UID: "ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.430736 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-kube-api-access-s6d5n" (OuterVolumeSpecName: "kube-api-access-s6d5n") pod "ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f" (UID: "ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f"). InnerVolumeSpecName "kube-api-access-s6d5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.440361 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.495010 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.495048 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6d5n\" (UniqueName: \"kubernetes.io/projected/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f-kube-api-access-s6d5n\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.673192 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-53d3-account-create-update-7n9p9"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.690443 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f0bc-account-create-update-xv578"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.706371 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9r5lq"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.742170 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4085-account-create-update-t9pdt"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.745977 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzg8l"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.800646 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6fvk\" (UniqueName: \"kubernetes.io/projected/01a9c221-8937-4524-98ed-508ca1522a9d-kube-api-access-c6fvk\") pod \"01a9c221-8937-4524-98ed-508ca1522a9d\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.800693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde9adc3-a038-4de5-949c-d366ebe287d6-operator-scripts\") pod \"fde9adc3-a038-4de5-949c-d366ebe287d6\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.800744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a9c221-8937-4524-98ed-508ca1522a9d-operator-scripts\") pod \"01a9c221-8937-4524-98ed-508ca1522a9d\" (UID: \"01a9c221-8937-4524-98ed-508ca1522a9d\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.800868 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4xjc\" (UniqueName: \"kubernetes.io/projected/fde9adc3-a038-4de5-949c-d366ebe287d6-kube-api-access-h4xjc\") pod \"fde9adc3-a038-4de5-949c-d366ebe287d6\" (UID: \"fde9adc3-a038-4de5-949c-d366ebe287d6\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.801953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde9adc3-a038-4de5-949c-d366ebe287d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fde9adc3-a038-4de5-949c-d366ebe287d6" (UID: "fde9adc3-a038-4de5-949c-d366ebe287d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.802138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a9c221-8937-4524-98ed-508ca1522a9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01a9c221-8937-4524-98ed-508ca1522a9d" (UID: "01a9c221-8937-4524-98ed-508ca1522a9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.805688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde9adc3-a038-4de5-949c-d366ebe287d6-kube-api-access-h4xjc" (OuterVolumeSpecName: "kube-api-access-h4xjc") pod "fde9adc3-a038-4de5-949c-d366ebe287d6" (UID: "fde9adc3-a038-4de5-949c-d366ebe287d6"). InnerVolumeSpecName "kube-api-access-h4xjc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.805729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a9c221-8937-4524-98ed-508ca1522a9d-kube-api-access-c6fvk" (OuterVolumeSpecName: "kube-api-access-c6fvk") pod "01a9c221-8937-4524-98ed-508ca1522a9d" (UID: "01a9c221-8937-4524-98ed-508ca1522a9d"). InnerVolumeSpecName "kube-api-access-c6fvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.888685 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dcm8j"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.889761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dcm8j" event={"ID":"ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f","Type":"ContainerDied","Data":"34af642c914a8e1313a2048b1ff40a80c8b933aee30cd9865046920599e35e47"}
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.889879 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34af642c914a8e1313a2048b1ff40a80c8b933aee30cd9865046920599e35e47"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.900439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9r5lq" event={"ID":"b169885b-4f49-41e1-83e7-d43237329082","Type":"ContainerDied","Data":"b86fca2e73ef0f04f38b97f7083190266d9e4a1c2db451aa99dc926dd3bf3517"}
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.900493 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b86fca2e73ef0f04f38b97f7083190266d9e4a1c2db451aa99dc926dd3bf3517"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.900595 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9r5lq"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.903821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb7b549-3520-49f2-9ee8-089bab48181d-operator-scripts\") pod \"0cb7b549-3520-49f2-9ee8-089bab48181d\" (UID: \"0cb7b549-3520-49f2-9ee8-089bab48181d\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.904225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8vdk\" (UniqueName: \"kubernetes.io/projected/b169885b-4f49-41e1-83e7-d43237329082-kube-api-access-f8vdk\") pod \"b169885b-4f49-41e1-83e7-d43237329082\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.904370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b169885b-4f49-41e1-83e7-d43237329082-operator-scripts\") pod \"b169885b-4f49-41e1-83e7-d43237329082\" (UID: \"b169885b-4f49-41e1-83e7-d43237329082\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.904453 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw2ss\" (UniqueName: \"kubernetes.io/projected/0cb7b549-3520-49f2-9ee8-089bab48181d-kube-api-access-tw2ss\") pod \"0cb7b549-3520-49f2-9ee8-089bab48181d\" (UID: \"0cb7b549-3520-49f2-9ee8-089bab48181d\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.904559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktmxx\" (UniqueName: \"kubernetes.io/projected/fe2b2973-1897-4202-be29-9ca7805d443a-kube-api-access-ktmxx\") pod \"fe2b2973-1897-4202-be29-9ca7805d443a\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.904657 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe2b2973-1897-4202-be29-9ca7805d443a-operator-scripts\") pod \"fe2b2973-1897-4202-be29-9ca7805d443a\" (UID: \"fe2b2973-1897-4202-be29-9ca7805d443a\") "
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.905189 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6fvk\" (UniqueName: \"kubernetes.io/projected/01a9c221-8937-4524-98ed-508ca1522a9d-kube-api-access-c6fvk\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.905269 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fde9adc3-a038-4de5-949c-d366ebe287d6-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.905329 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a9c221-8937-4524-98ed-508ca1522a9d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.905389 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4xjc\" (UniqueName: \"kubernetes.io/projected/fde9adc3-a038-4de5-949c-d366ebe287d6-kube-api-access-h4xjc\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.905580 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b169885b-4f49-41e1-83e7-d43237329082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b169885b-4f49-41e1-83e7-d43237329082" (UID: "b169885b-4f49-41e1-83e7-d43237329082"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.905859 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2b2973-1897-4202-be29-9ca7805d443a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe2b2973-1897-4202-be29-9ca7805d443a" (UID: "fe2b2973-1897-4202-be29-9ca7805d443a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.905890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb7b549-3520-49f2-9ee8-089bab48181d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cb7b549-3520-49f2-9ee8-089bab48181d" (UID: "0cb7b549-3520-49f2-9ee8-089bab48181d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.909554 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b169885b-4f49-41e1-83e7-d43237329082-kube-api-access-f8vdk" (OuterVolumeSpecName: "kube-api-access-f8vdk") pod "b169885b-4f49-41e1-83e7-d43237329082" (UID: "b169885b-4f49-41e1-83e7-d43237329082"). InnerVolumeSpecName "kube-api-access-f8vdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.910979 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2b2973-1897-4202-be29-9ca7805d443a-kube-api-access-ktmxx" (OuterVolumeSpecName: "kube-api-access-ktmxx") pod "fe2b2973-1897-4202-be29-9ca7805d443a" (UID: "fe2b2973-1897-4202-be29-9ca7805d443a"). InnerVolumeSpecName "kube-api-access-ktmxx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.913224 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb7b549-3520-49f2-9ee8-089bab48181d-kube-api-access-tw2ss" (OuterVolumeSpecName: "kube-api-access-tw2ss") pod "0cb7b549-3520-49f2-9ee8-089bab48181d" (UID: "0cb7b549-3520-49f2-9ee8-089bab48181d"). InnerVolumeSpecName "kube-api-access-tw2ss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.913961 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4085-account-create-update-t9pdt" event={"ID":"0cb7b549-3520-49f2-9ee8-089bab48181d","Type":"ContainerDied","Data":"38b64f8108d7897feab7aeaff8e92ae3804e7c0ce595e35184e2b0a9449ee069"}
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.913994 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38b64f8108d7897feab7aeaff8e92ae3804e7c0ce595e35184e2b0a9449ee069"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.914075 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4085-account-create-update-t9pdt"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.920815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-53d3-account-create-update-7n9p9" event={"ID":"fde9adc3-a038-4de5-949c-d366ebe287d6","Type":"ContainerDied","Data":"75e49b58616d784e186760c0bcf91e4de08697c2263ff42d5e110e1f653dbf1c"}
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.920851 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75e49b58616d784e186760c0bcf91e4de08697c2263ff42d5e110e1f653dbf1c"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.920942 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-53d3-account-create-update-7n9p9"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.926500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzg8l" event={"ID":"fe2b2973-1897-4202-be29-9ca7805d443a","Type":"ContainerDied","Data":"4276fb0a6bd638bcd4fdd7f44d2b12085d6d72903c861efba0c600eabdae2226"}
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.926569 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4276fb0a6bd638bcd4fdd7f44d2b12085d6d72903c861efba0c600eabdae2226"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.926635 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzg8l"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.929633 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f0bc-account-create-update-xv578"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.929880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f0bc-account-create-update-xv578" event={"ID":"01a9c221-8937-4524-98ed-508ca1522a9d","Type":"ContainerDied","Data":"e9b6623d1759483ff12551da68ebbe3d40860671545132a395932a91be7c6cd0"}
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.929965 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9b6623d1759483ff12551da68ebbe3d40860671545132a395932a91be7c6cd0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.930929 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.931068 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.963346 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 19 18:53:41 crc kubenswrapper[4749]: I0219 18:53:41.987199 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.007671 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe2b2973-1897-4202-be29-9ca7805d443a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.007884 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb7b549-3520-49f2-9ee8-089bab48181d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.007944 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8vdk\" (UniqueName: \"kubernetes.io/projected/b169885b-4f49-41e1-83e7-d43237329082-kube-api-access-f8vdk\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.007998 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b169885b-4f49-41e1-83e7-d43237329082-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.008083 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw2ss\" (UniqueName: \"kubernetes.io/projected/0cb7b549-3520-49f2-9ee8-089bab48181d-kube-api-access-tw2ss\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.008146 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktmxx\" (UniqueName: \"kubernetes.io/projected/fe2b2973-1897-4202-be29-9ca7805d443a-kube-api-access-ktmxx\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.107299 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.107715 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 18:53:42 crc kubenswrapper[4749]: I0219 18:53:42.497696 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.133049 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fhz4p"]
Feb 19 18:53:43 crc kubenswrapper[4749]: E0219 18:53:43.134809 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.134940 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: E0219 18:53:43.135049 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7b549-3520-49f2-9ee8-089bab48181d" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.135143 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7b549-3520-49f2-9ee8-089bab48181d" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: E0219 18:53:43.135232 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a9c221-8937-4524-98ed-508ca1522a9d" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.135304 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a9c221-8937-4524-98ed-508ca1522a9d" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: E0219 18:53:43.135397 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde9adc3-a038-4de5-949c-d366ebe287d6" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.135480 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde9adc3-a038-4de5-949c-d366ebe287d6" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: E0219 18:53:43.135570 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b169885b-4f49-41e1-83e7-d43237329082" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.135642 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b169885b-4f49-41e1-83e7-d43237329082" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: E0219 18:53:43.135735 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2b2973-1897-4202-be29-9ca7805d443a" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.135822 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2b2973-1897-4202-be29-9ca7805d443a" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.136152 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde9adc3-a038-4de5-949c-d366ebe287d6" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.136252 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b169885b-4f49-41e1-83e7-d43237329082" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.136322 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2b2973-1897-4202-be29-9ca7805d443a" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.136397 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7b549-3520-49f2-9ee8-089bab48181d" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.136478 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a9c221-8937-4524-98ed-508ca1522a9d" containerName="mariadb-account-create-update"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.136556 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f" containerName="mariadb-database-create"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.137448 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.140593 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vvztd"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.140675 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.151945 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fhz4p"]
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.158921 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.231833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-config-data\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.232129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.232491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-scripts\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.232554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqw2\" (UniqueName: \"kubernetes.io/projected/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-kube-api-access-cxqw2\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.334371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqw2\" (UniqueName: \"kubernetes.io/projected/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-kube-api-access-cxqw2\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.334506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-config-data\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.334536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.334604 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-scripts\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.343592 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-scripts\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.344625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-config-data\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.347193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.355538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqw2\" (UniqueName: \"kubernetes.io/projected/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-kube-api-access-cxqw2\") pod \"nova-cell0-conductor-db-sync-fhz4p\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.454956 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fhz4p"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.948430 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fhz4p"]
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.952484 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 18:53:43 crc kubenswrapper[4749]: I0219 18:53:43.952509 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 18:53:44 crc kubenswrapper[4749]: I0219 18:53:44.197745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:44 crc kubenswrapper[4749]: I0219 18:53:44.248386 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 18:53:44 crc kubenswrapper[4749]: I0219 18:53:44.656792 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 18:53:44 crc kubenswrapper[4749]: I0219 18:53:44.657009 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="90ac14db-0b4c-4638-882d-1ab8c9cde6e5" containerName="watcher-decision-engine" containerID="cri-o://86980e71adcd70f4ca16bc5945aa7844f479527ad5ab60c0a8ec9ea9440a83a2" gracePeriod=30
Feb 19 18:53:44 crc kubenswrapper[4749]: I0219 18:53:44.976636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fhz4p" event={"ID":"bba20baa-aaa9-44cd-8076-2a93bf65ff1e","Type":"ContainerStarted","Data":"daf9705d83cbfeed4696f83574c601c10007c0db5c3df4418b374cb25474a4da"}
Feb 19 18:53:49 crc kubenswrapper[4749]: I0219 18:53:49.023387 4749 generic.go:334] "Generic (PLEG): container finished" podID="90ac14db-0b4c-4638-882d-1ab8c9cde6e5" containerID="86980e71adcd70f4ca16bc5945aa7844f479527ad5ab60c0a8ec9ea9440a83a2" exitCode=0
Feb 19 18:53:49 crc kubenswrapper[4749]: I0219 18:53:49.023472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"90ac14db-0b4c-4638-882d-1ab8c9cde6e5","Type":"ContainerDied","Data":"86980e71adcd70f4ca16bc5945aa7844f479527ad5ab60c0a8ec9ea9440a83a2"}
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.003045 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.081228 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"90ac14db-0b4c-4638-882d-1ab8c9cde6e5","Type":"ContainerDied","Data":"10f9ce2ac5963bbf9a71fec1f185ba3242ac14993182426de7e675a29fa362fa"}
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.081775 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.082747 4749 scope.go:117] "RemoveContainer" containerID="86980e71adcd70f4ca16bc5945aa7844f479527ad5ab60c0a8ec9ea9440a83a2"
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.135249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-combined-ca-bundle\") pod \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") "
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.135365 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-config-data\") pod \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") "
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.135453 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwk8b\" (UniqueName: \"kubernetes.io/projected/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-kube-api-access-nwk8b\") pod \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") "
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.135524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-custom-prometheus-ca\") pod \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") "
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.135557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-logs\") pod \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\" (UID: \"90ac14db-0b4c-4638-882d-1ab8c9cde6e5\") "
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.136244 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-logs" (OuterVolumeSpecName: "logs") pod "90ac14db-0b4c-4638-882d-1ab8c9cde6e5" (UID: "90ac14db-0b4c-4638-882d-1ab8c9cde6e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.141136 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-kube-api-access-nwk8b" (OuterVolumeSpecName: "kube-api-access-nwk8b") pod "90ac14db-0b4c-4638-882d-1ab8c9cde6e5" (UID: "90ac14db-0b4c-4638-882d-1ab8c9cde6e5"). InnerVolumeSpecName "kube-api-access-nwk8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.164370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "90ac14db-0b4c-4638-882d-1ab8c9cde6e5" (UID: "90ac14db-0b4c-4638-882d-1ab8c9cde6e5"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.165846 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90ac14db-0b4c-4638-882d-1ab8c9cde6e5" (UID: "90ac14db-0b4c-4638-882d-1ab8c9cde6e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.196386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-config-data" (OuterVolumeSpecName: "config-data") pod "90ac14db-0b4c-4638-882d-1ab8c9cde6e5" (UID: "90ac14db-0b4c-4638-882d-1ab8c9cde6e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.237211 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.237458 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.237566 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwk8b\" (UniqueName: \"kubernetes.io/projected/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-kube-api-access-nwk8b\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.237650 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.237739 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90ac14db-0b4c-4638-882d-1ab8c9cde6e5-logs\") on node \"crc\" DevicePath \"\""
Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.417587 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 19 18:53:53 crc
kubenswrapper[4749]: I0219 18:53:53.429408 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.444106 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:53 crc kubenswrapper[4749]: E0219 18:53:53.444593 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ac14db-0b4c-4638-882d-1ab8c9cde6e5" containerName="watcher-decision-engine" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.444620 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ac14db-0b4c-4638-882d-1ab8c9cde6e5" containerName="watcher-decision-engine" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.444846 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ac14db-0b4c-4638-882d-1ab8c9cde6e5" containerName="watcher-decision-engine" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.445854 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.448399 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.456556 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.542759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.542828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f18804d-f75a-4e9c-ba11-ba225b074df7-logs\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.542925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f584d\" (UniqueName: \"kubernetes.io/projected/5f18804d-f75a-4e9c-ba11-ba225b074df7-kube-api-access-f584d\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.543014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " 
pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.543064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.645263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.645324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f18804d-f75a-4e9c-ba11-ba225b074df7-logs\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.645414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f584d\" (UniqueName: \"kubernetes.io/projected/5f18804d-f75a-4e9c-ba11-ba225b074df7-kube-api-access-f584d\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.645506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc 
kubenswrapper[4749]: I0219 18:53:53.645539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.646663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f18804d-f75a-4e9c-ba11-ba225b074df7-logs\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.650819 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.650967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.651483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f18804d-f75a-4e9c-ba11-ba225b074df7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.665531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f584d\" (UniqueName: \"kubernetes.io/projected/5f18804d-f75a-4e9c-ba11-ba225b074df7-kube-api-access-f584d\") pod \"watcher-decision-engine-0\" (UID: \"5f18804d-f75a-4e9c-ba11-ba225b074df7\") " pod="openstack/watcher-decision-engine-0" Feb 19 18:53:53 crc kubenswrapper[4749]: I0219 18:53:53.773227 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 18:53:54 crc kubenswrapper[4749]: I0219 18:53:54.110467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fhz4p" event={"ID":"bba20baa-aaa9-44cd-8076-2a93bf65ff1e","Type":"ContainerStarted","Data":"cf43de36c50b60b14fe4b2bcb8e098295a7c8556e9921688cc793bf38436280e"} Feb 19 18:53:54 crc kubenswrapper[4749]: I0219 18:53:54.139253 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-fhz4p" podStartSLOduration=2.29452909 podStartE2EDuration="11.139233068s" podCreationTimestamp="2026-02-19 18:53:43 +0000 UTC" firstStartedPulling="2026-02-19 18:53:43.954565859 +0000 UTC m=+1197.915785813" lastFinishedPulling="2026-02-19 18:53:52.799269837 +0000 UTC m=+1206.760489791" observedRunningTime="2026-02-19 18:53:54.132623228 +0000 UTC m=+1208.093843182" watchObservedRunningTime="2026-02-19 18:53:54.139233068 +0000 UTC m=+1208.100453022" Feb 19 18:53:54 crc kubenswrapper[4749]: I0219 18:53:54.319567 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 18:53:54 crc kubenswrapper[4749]: I0219 18:53:54.692330 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90ac14db-0b4c-4638-882d-1ab8c9cde6e5" path="/var/lib/kubelet/pods/90ac14db-0b4c-4638-882d-1ab8c9cde6e5/volumes" Feb 19 18:53:54 crc kubenswrapper[4749]: I0219 18:53:54.726579 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:53:54 crc kubenswrapper[4749]: I0219 18:53:54.726656 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:53:55 crc kubenswrapper[4749]: I0219 18:53:55.131728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"5f18804d-f75a-4e9c-ba11-ba225b074df7","Type":"ContainerStarted","Data":"1e5405bf5ffbf6ba1ce26339ee3ffefa0f2a1e20eaba7b0d5872a82ee5ab4e38"} Feb 19 18:53:55 crc kubenswrapper[4749]: I0219 18:53:55.132079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"5f18804d-f75a-4e9c-ba11-ba225b074df7","Type":"ContainerStarted","Data":"754acad90e79c0fed757433fc9db86bf7338e319a7364be970401d649013c429"} Feb 19 18:53:55 crc kubenswrapper[4749]: I0219 18:53:55.161431 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.161414854 podStartE2EDuration="2.161414854s" podCreationTimestamp="2026-02-19 18:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:53:55.151435913 +0000 UTC m=+1209.112655867" watchObservedRunningTime="2026-02-19 18:53:55.161414854 +0000 UTC m=+1209.122634808" Feb 19 18:53:56 crc kubenswrapper[4749]: I0219 18:53:56.797653 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="proxy-httpd" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 19 18:54:02 crc kubenswrapper[4749]: I0219 18:54:02.192338 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerID="96d12a07a67dbe4dc0d82d1eb9ea6cfa2cd3f3e1b225475b6aac87b7fdcb0742" exitCode=137 Feb 19 18:54:02 crc kubenswrapper[4749]: I0219 18:54:02.192549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerDied","Data":"96d12a07a67dbe4dc0d82d1eb9ea6cfa2cd3f3e1b225475b6aac87b7fdcb0742"} Feb 19 18:54:02 crc kubenswrapper[4749]: I0219 18:54:02.894870 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.030511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-run-httpd\") pod \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.030556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-log-httpd\") pod \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.030595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sbnx\" (UniqueName: \"kubernetes.io/projected/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-kube-api-access-2sbnx\") pod \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.030663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-scripts\") pod \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.030765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-sg-core-conf-yaml\") pod \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.030819 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-combined-ca-bundle\") pod \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.031101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-config-data\") pod \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\" (UID: \"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce\") " Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.032973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" (UID: "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.033194 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" (UID: "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.046847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-kube-api-access-2sbnx" (OuterVolumeSpecName: "kube-api-access-2sbnx") pod "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" (UID: "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce"). InnerVolumeSpecName "kube-api-access-2sbnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.052806 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-scripts" (OuterVolumeSpecName: "scripts") pod "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" (UID: "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.070191 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" (UID: "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.130590 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" (UID: "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.133683 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.133720 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.133734 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.133745 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.133758 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sbnx\" (UniqueName: \"kubernetes.io/projected/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-kube-api-access-2sbnx\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.133774 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.163077 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-config-data" (OuterVolumeSpecName: "config-data") pod "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" (UID: "a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.203624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce","Type":"ContainerDied","Data":"8f06a677c2da21482bd507aa6a6b708e1fe6a12c48971d6abc28454a64438a05"} Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.203671 4749 scope.go:117] "RemoveContainer" containerID="96d12a07a67dbe4dc0d82d1eb9ea6cfa2cd3f3e1b225475b6aac87b7fdcb0742" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.203716 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.224437 4749 scope.go:117] "RemoveContainer" containerID="3d826c7e7495884091cab7bfb336ab0f5a1b33bf1ab2075b4dc7fc62238156f4" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.235412 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.242547 4749 scope.go:117] "RemoveContainer" containerID="4a91a7722ebeaf7cb5bc44c246f5749005fabdd8e6f8416ce3be838105726ef5" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.251440 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.272357 4749 scope.go:117] "RemoveContainer" containerID="b860af748501e19490a50014dec379a7ea76b3265261574b5c49747643480f42" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.291450 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.302942 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:54:03 crc kubenswrapper[4749]: E0219 18:54:03.303343 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="proxy-httpd" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303360 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="proxy-httpd" Feb 19 18:54:03 crc kubenswrapper[4749]: E0219 18:54:03.303379 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-notification-agent" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303386 4749 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-notification-agent" Feb 19 18:54:03 crc kubenswrapper[4749]: E0219 18:54:03.303404 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-central-agent" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303410 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-central-agent" Feb 19 18:54:03 crc kubenswrapper[4749]: E0219 18:54:03.303434 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="sg-core" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303440 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="sg-core" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303605 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-central-agent" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303615 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="ceilometer-notification-agent" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303631 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="sg-core" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.303644 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" containerName="proxy-httpd" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.305211 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.308459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.313591 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.313876 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.439419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-scripts\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.439500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht4pz\" (UniqueName: \"kubernetes.io/projected/b39e6857-5164-4568-b6e9-bbb966f59eaf-kube-api-access-ht4pz\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.439526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-run-httpd\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.439557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-config-data\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " 
pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.439637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.439775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.440017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-log-httpd\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.541874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-scripts\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.541942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht4pz\" (UniqueName: \"kubernetes.io/projected/b39e6857-5164-4568-b6e9-bbb966f59eaf-kube-api-access-ht4pz\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.541976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-run-httpd\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.542015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-config-data\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.542098 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.542121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.542204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-log-httpd\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.542457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-run-httpd\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc 
kubenswrapper[4749]: I0219 18:54:03.542578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-log-httpd\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.546862 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.547980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-scripts\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.548246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-config-data\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.548468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.575525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht4pz\" (UniqueName: \"kubernetes.io/projected/b39e6857-5164-4568-b6e9-bbb966f59eaf-kube-api-access-ht4pz\") pod \"ceilometer-0\" (UID: 
\"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.631866 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.774341 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 18:54:03 crc kubenswrapper[4749]: I0219 18:54:03.844084 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 19 18:54:04 crc kubenswrapper[4749]: I0219 18:54:04.155756 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:54:04 crc kubenswrapper[4749]: I0219 18:54:04.218390 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerStarted","Data":"6965ff897db76677d359d2a97eaab40f61d5e0413240ad0f40f827670935a5b3"} Feb 19 18:54:04 crc kubenswrapper[4749]: I0219 18:54:04.218432 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 18:54:04 crc kubenswrapper[4749]: I0219 18:54:04.247522 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 19 18:54:04 crc kubenswrapper[4749]: I0219 18:54:04.690453 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce" path="/var/lib/kubelet/pods/a9a36573-a6ea-481c-bdc1-ea3d7b06e2ce/volumes" Feb 19 18:54:04 crc kubenswrapper[4749]: I0219 18:54:04.731008 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 18:54:05 crc kubenswrapper[4749]: I0219 18:54:05.242499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerStarted","Data":"5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde"} Feb 19 18:54:05 crc kubenswrapper[4749]: I0219 18:54:05.244085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerStarted","Data":"832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226"} Feb 19 18:54:06 crc kubenswrapper[4749]: I0219 18:54:06.253831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerStarted","Data":"1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9"} Feb 19 18:54:08 crc kubenswrapper[4749]: I0219 18:54:08.276512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerStarted","Data":"8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909"} Feb 19 18:54:08 crc kubenswrapper[4749]: I0219 18:54:08.278264 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 18:54:08 crc kubenswrapper[4749]: I0219 18:54:08.310709 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.099393001 podStartE2EDuration="5.310685866s" podCreationTimestamp="2026-02-19 18:54:03 +0000 UTC" firstStartedPulling="2026-02-19 18:54:04.167130371 +0000 UTC m=+1218.128350315" lastFinishedPulling="2026-02-19 18:54:07.378423226 +0000 UTC m=+1221.339643180" observedRunningTime="2026-02-19 18:54:08.306624498 +0000 UTC m=+1222.267844462" watchObservedRunningTime="2026-02-19 18:54:08.310685866 +0000 UTC m=+1222.271905820" Feb 19 18:54:10 crc kubenswrapper[4749]: I0219 18:54:10.296383 4749 generic.go:334] "Generic (PLEG): container finished" podID="bba20baa-aaa9-44cd-8076-2a93bf65ff1e" 
containerID="cf43de36c50b60b14fe4b2bcb8e098295a7c8556e9921688cc793bf38436280e" exitCode=0 Feb 19 18:54:10 crc kubenswrapper[4749]: I0219 18:54:10.296644 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fhz4p" event={"ID":"bba20baa-aaa9-44cd-8076-2a93bf65ff1e","Type":"ContainerDied","Data":"cf43de36c50b60b14fe4b2bcb8e098295a7c8556e9921688cc793bf38436280e"} Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.699558 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fhz4p" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.822969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-scripts\") pod \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.823266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxqw2\" (UniqueName: \"kubernetes.io/projected/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-kube-api-access-cxqw2\") pod \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.823440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-config-data\") pod \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\" (UID: \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.823574 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-combined-ca-bundle\") pod \"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\" (UID: 
\"bba20baa-aaa9-44cd-8076-2a93bf65ff1e\") " Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.828219 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-scripts" (OuterVolumeSpecName: "scripts") pod "bba20baa-aaa9-44cd-8076-2a93bf65ff1e" (UID: "bba20baa-aaa9-44cd-8076-2a93bf65ff1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.841576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-kube-api-access-cxqw2" (OuterVolumeSpecName: "kube-api-access-cxqw2") pod "bba20baa-aaa9-44cd-8076-2a93bf65ff1e" (UID: "bba20baa-aaa9-44cd-8076-2a93bf65ff1e"). InnerVolumeSpecName "kube-api-access-cxqw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.861920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bba20baa-aaa9-44cd-8076-2a93bf65ff1e" (UID: "bba20baa-aaa9-44cd-8076-2a93bf65ff1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.868202 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-config-data" (OuterVolumeSpecName: "config-data") pod "bba20baa-aaa9-44cd-8076-2a93bf65ff1e" (UID: "bba20baa-aaa9-44cd-8076-2a93bf65ff1e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.926754 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.926791 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxqw2\" (UniqueName: \"kubernetes.io/projected/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-kube-api-access-cxqw2\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.926802 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:11 crc kubenswrapper[4749]: I0219 18:54:11.926811 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba20baa-aaa9-44cd-8076-2a93bf65ff1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.316345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fhz4p" event={"ID":"bba20baa-aaa9-44cd-8076-2a93bf65ff1e","Type":"ContainerDied","Data":"daf9705d83cbfeed4696f83574c601c10007c0db5c3df4418b374cb25474a4da"} Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.316707 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daf9705d83cbfeed4696f83574c601c10007c0db5c3df4418b374cb25474a4da" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.316408 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fhz4p" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.407616 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 18:54:12 crc kubenswrapper[4749]: E0219 18:54:12.408186 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba20baa-aaa9-44cd-8076-2a93bf65ff1e" containerName="nova-cell0-conductor-db-sync" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.408213 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba20baa-aaa9-44cd-8076-2a93bf65ff1e" containerName="nova-cell0-conductor-db-sync" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.408460 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba20baa-aaa9-44cd-8076-2a93bf65ff1e" containerName="nova-cell0-conductor-db-sync" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.409378 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.411213 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vvztd" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.411507 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.418320 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.541624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5lw\" (UniqueName: \"kubernetes.io/projected/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-kube-api-access-df5lw\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc 
kubenswrapper[4749]: I0219 18:54:12.541663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.541706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.645695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5lw\" (UniqueName: \"kubernetes.io/projected/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-kube-api-access-df5lw\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.645748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.645788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.651109 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.663301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.666185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5lw\" (UniqueName: \"kubernetes.io/projected/4ccfb5c8-e819-4a4c-bf02-d7dd004d970e-kube-api-access-df5lw\") pod \"nova-cell0-conductor-0\" (UID: \"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e\") " pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:12 crc kubenswrapper[4749]: I0219 18:54:12.728291 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:13 crc kubenswrapper[4749]: I0219 18:54:13.150993 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 18:54:13 crc kubenswrapper[4749]: I0219 18:54:13.326087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e","Type":"ContainerStarted","Data":"705f6fde454123d9ff703d424f28a4ef3c383792303f38b3277556c5c29c751c"} Feb 19 18:54:14 crc kubenswrapper[4749]: I0219 18:54:14.336167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4ccfb5c8-e819-4a4c-bf02-d7dd004d970e","Type":"ContainerStarted","Data":"9e2786e3eddf2e4a4f248ba46a6a6c451a3d17dbecaee397981786fce97d89c2"} Feb 19 18:54:14 crc kubenswrapper[4749]: I0219 18:54:14.336658 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:14 crc kubenswrapper[4749]: I0219 18:54:14.358142 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.358115473 podStartE2EDuration="2.358115473s" podCreationTimestamp="2026-02-19 18:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:14.350832487 +0000 UTC m=+1228.312052451" watchObservedRunningTime="2026-02-19 18:54:14.358115473 +0000 UTC m=+1228.319335457" Feb 19 18:54:22 crc kubenswrapper[4749]: I0219 18:54:22.757444 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.188518 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lnwdf"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.191424 4749 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.193454 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.200980 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.204008 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnwdf"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.245610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-config-data\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.245676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.245723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-scripts\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.245751 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5bvns\" (UniqueName: \"kubernetes.io/projected/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-kube-api-access-5bvns\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.350102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-config-data\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.350186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.350234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-scripts\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.350278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bvns\" (UniqueName: \"kubernetes.io/projected/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-kube-api-access-5bvns\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.361640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.362781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-scripts\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.365054 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.367071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-config-data\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.367831 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.374005 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvns\" (UniqueName: \"kubernetes.io/projected/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-kube-api-access-5bvns\") pod \"nova-cell0-cell-mapping-lnwdf\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") " pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.376228 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.436369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.455281 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb0ae847-95f0-442e-82e7-2c05a874e190-logs\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.455345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-config-data\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.455396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxmcc\" (UniqueName: \"kubernetes.io/projected/cb0ae847-95f0-442e-82e7-2c05a874e190-kube-api-access-qxmcc\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.455555 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.493988 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.495578 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.514069 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.515044 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnwdf" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.515816 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-logs\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559311 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559347 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqm76\" (UniqueName: \"kubernetes.io/projected/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-kube-api-access-tqm76\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559401 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb0ae847-95f0-442e-82e7-2c05a874e190-logs\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-config-data\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxmcc\" (UniqueName: \"kubernetes.io/projected/cb0ae847-95f0-442e-82e7-2c05a874e190-kube-api-access-qxmcc\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.559541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-config-data\") pod \"nova-metadata-0\" 
(UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.560448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb0ae847-95f0-442e-82e7-2c05a874e190-logs\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.577692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.581313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-config-data\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.596239 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.599041 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.605828 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.622570 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.625648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxmcc\" (UniqueName: \"kubernetes.io/projected/cb0ae847-95f0-442e-82e7-2c05a874e190-kube-api-access-qxmcc\") pod \"nova-api-0\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.656117 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c47c5f7-8xl4v"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.658093 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.661054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc8p9\" (UniqueName: \"kubernetes.io/projected/0fab25af-587f-47c5-a11c-c733e64c3be9-kube-api-access-cc8p9\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.661137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.661159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-config-data\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.661178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.661229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-logs\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc 
kubenswrapper[4749]: I0219 18:54:23.661247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.661275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqm76\" (UniqueName: \"kubernetes.io/projected/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-kube-api-access-tqm76\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.665583 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-logs\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.689103 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c47c5f7-8xl4v"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.691731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.693240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-config-data\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.707594 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqm76\" (UniqueName: \"kubernetes.io/projected/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-kube-api-access-tqm76\") pod \"nova-metadata-0\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.715921 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.717370 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.721427 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.749313 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.763192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.763239 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.763277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-config-data\") pod \"nova-scheduler-0\" (UID: 
\"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.763311 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-sb\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.763343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwsck\" (UniqueName: \"kubernetes.io/projected/6ba77726-b8ed-4851-8796-ddda0f4d4406-kube-api-access-nwsck\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.763411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.763457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-config\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.765131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-swift-storage-0\") pod 
\"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.765151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr7qz\" (UniqueName: \"kubernetes.io/projected/c6f83341-b64f-4544-b33d-bd987d9be64e-kube-api-access-gr7qz\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.765229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc8p9\" (UniqueName: \"kubernetes.io/projected/0fab25af-587f-47c5-a11c-c733e64c3be9-kube-api-access-cc8p9\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.765484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-nb\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.765504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-svc\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.769425 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.787531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.800636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.826572 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.831478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc8p9\" (UniqueName: \"kubernetes.io/projected/0fab25af-587f-47c5-a11c-c733e64c3be9-kube-api-access-cc8p9\") pod \"nova-cell1-novncproxy-0\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.871974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-config\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.872095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-swift-storage-0\") pod 
\"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.872148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr7qz\" (UniqueName: \"kubernetes.io/projected/c6f83341-b64f-4544-b33d-bd987d9be64e-kube-api-access-gr7qz\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.872284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-nb\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.872315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-svc\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.873760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-config\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.875240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-config-data\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 
crc kubenswrapper[4749]: I0219 18:54:23.875426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-sb\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.875540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-swift-storage-0\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.875845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsck\" (UniqueName: \"kubernetes.io/projected/6ba77726-b8ed-4851-8796-ddda0f4d4406-kube-api-access-nwsck\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.876312 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.876524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-svc\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.877552 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-sb\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.879333 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-nb\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.882955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-config-data\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.883301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.903356 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwsck\" (UniqueName: \"kubernetes.io/projected/6ba77726-b8ed-4851-8796-ddda0f4d4406-kube-api-access-nwsck\") pod \"dnsmasq-dns-65c47c5f7-8xl4v\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:23 crc kubenswrapper[4749]: I0219 18:54:23.914193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr7qz\" (UniqueName: 
\"kubernetes.io/projected/c6f83341-b64f-4544-b33d-bd987d9be64e-kube-api-access-gr7qz\") pod \"nova-scheduler-0\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.067893 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.110113 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.138921 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.207706 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnwdf"] Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.367775 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.450831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnwdf" event={"ID":"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7","Type":"ContainerStarted","Data":"14be9ba82fec78e36a048e7ab8ca2c33ca37c41cd7b77353df49c3783957d62e"} Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.452810 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.598729 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-znh79"] Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.601797 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.605040 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.605240 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.621593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-znh79"] Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.706142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.706305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg2qg\" (UniqueName: \"kubernetes.io/projected/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-kube-api-access-wg2qg\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.706374 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-config-data\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.706399 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-scripts\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.725912 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.725963 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.726017 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.727046 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92113cb5e4b06748bc8c8134a5b7e7475e1c6d6f0cf9e918b487ab08df45bb1f"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.727100 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" 
containerID="cri-o://92113cb5e4b06748bc8c8134a5b7e7475e1c6d6f0cf9e918b487ab08df45bb1f" gracePeriod=600 Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.807803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg2qg\" (UniqueName: \"kubernetes.io/projected/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-kube-api-access-wg2qg\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.807852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-config-data\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.807877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-scripts\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.807936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.809974 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c47c5f7-8xl4v"] Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.826440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-config-data\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.827111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.827354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-scripts\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.833346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg2qg\" (UniqueName: \"kubernetes.io/projected/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-kube-api-access-wg2qg\") pod \"nova-cell1-conductor-db-sync-znh79\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.926498 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:24 crc kubenswrapper[4749]: I0219 18:54:24.935354 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:54:24 crc kubenswrapper[4749]: W0219 18:54:24.937587 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fab25af_587f_47c5_a11c_c733e64c3be9.slice/crio-79752a59192bc5af7e0392926a900018b90511789646065101c112bd6c1fc347 WatchSource:0}: Error finding container 79752a59192bc5af7e0392926a900018b90511789646065101c112bd6c1fc347: Status 404 returned error can't find the container with id 79752a59192bc5af7e0392926a900018b90511789646065101c112bd6c1fc347 Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.044341 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:25 crc kubenswrapper[4749]: W0219 18:54:25.050764 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6f83341_b64f_4544_b33d_bd987d9be64e.slice/crio-f9fd49ba5e07af95a9bdbff645fa68a36b92b727d0b25a508be7aebd450578b3 WatchSource:0}: Error finding container f9fd49ba5e07af95a9bdbff645fa68a36b92b727d0b25a508be7aebd450578b3: Status 404 returned error can't find the container with id f9fd49ba5e07af95a9bdbff645fa68a36b92b727d0b25a508be7aebd450578b3 Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.470508 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-znh79"] Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.482447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6f83341-b64f-4544-b33d-bd987d9be64e","Type":"ContainerStarted","Data":"f9fd49ba5e07af95a9bdbff645fa68a36b92b727d0b25a508be7aebd450578b3"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 
18:54:25.485226 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90","Type":"ContainerStarted","Data":"8e47a92d45ec2660f5009808181dd98ef6ffc180d2e9e1de193d7d2c66b8b777"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.487376 4749 generic.go:334] "Generic (PLEG): container finished" podID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerID="a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81" exitCode=0 Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.487464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" event={"ID":"6ba77726-b8ed-4851-8796-ddda0f4d4406","Type":"ContainerDied","Data":"a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.487502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" event={"ID":"6ba77726-b8ed-4851-8796-ddda0f4d4406","Type":"ContainerStarted","Data":"b42eb91c31ae37329644415de1ac81fb41a50509a6e3277817978af05bc574b5"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.489763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0fab25af-587f-47c5-a11c-c733e64c3be9","Type":"ContainerStarted","Data":"79752a59192bc5af7e0392926a900018b90511789646065101c112bd6c1fc347"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.530905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb0ae847-95f0-442e-82e7-2c05a874e190","Type":"ContainerStarted","Data":"889adf59bc131d682b291d5893b8863f04dacd374041707eead2f0b484206306"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.580698 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="92113cb5e4b06748bc8c8134a5b7e7475e1c6d6f0cf9e918b487ab08df45bb1f" exitCode=0 
Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.580819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"92113cb5e4b06748bc8c8134a5b7e7475e1c6d6f0cf9e918b487ab08df45bb1f"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.580922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"6d4364be274ebee2999aeb5c900bcffa311ebe5dc83a06265c201d7a01da149a"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.580952 4749 scope.go:117] "RemoveContainer" containerID="53301e57110cdb8ea70a0d60fa7f17a0a5c063180c8f2db9d35e0ad01b3622e9" Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.587567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnwdf" event={"ID":"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7","Type":"ContainerStarted","Data":"e741afc503439fe8962a221396ee12bb86f3f6ae7001a6e89147f203bd41ac97"} Feb 19 18:54:25 crc kubenswrapper[4749]: I0219 18:54:25.621231 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lnwdf" podStartSLOduration=2.621201969 podStartE2EDuration="2.621201969s" podCreationTimestamp="2026-02-19 18:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:25.620467911 +0000 UTC m=+1239.581687855" watchObservedRunningTime="2026-02-19 18:54:25.621201969 +0000 UTC m=+1239.582421923" Feb 19 18:54:26 crc kubenswrapper[4749]: I0219 18:54:26.622185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-znh79" 
event={"ID":"46f858df-0b0b-4e6a-ada4-c3063dbae4c5","Type":"ContainerStarted","Data":"5235264586dc444c39526b9f671323ba394d06f3ac246609350fd331e16a09fa"} Feb 19 18:54:26 crc kubenswrapper[4749]: I0219 18:54:26.622960 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-znh79" event={"ID":"46f858df-0b0b-4e6a-ada4-c3063dbae4c5","Type":"ContainerStarted","Data":"84e6e54b5695689f3aa045d425ba9110e3cfd1e0e0b861348e303f70e9771ec6"} Feb 19 18:54:26 crc kubenswrapper[4749]: I0219 18:54:26.631452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" event={"ID":"6ba77726-b8ed-4851-8796-ddda0f4d4406","Type":"ContainerStarted","Data":"aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab"} Feb 19 18:54:26 crc kubenswrapper[4749]: I0219 18:54:26.632216 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:26 crc kubenswrapper[4749]: I0219 18:54:26.643432 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-znh79" podStartSLOduration=2.643416856 podStartE2EDuration="2.643416856s" podCreationTimestamp="2026-02-19 18:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:26.641120241 +0000 UTC m=+1240.602340205" watchObservedRunningTime="2026-02-19 18:54:26.643416856 +0000 UTC m=+1240.604636810" Feb 19 18:54:26 crc kubenswrapper[4749]: I0219 18:54:26.670170 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" podStartSLOduration=3.6701496430000002 podStartE2EDuration="3.670149643s" podCreationTimestamp="2026-02-19 18:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
18:54:26.664158059 +0000 UTC m=+1240.625378023" watchObservedRunningTime="2026-02-19 18:54:26.670149643 +0000 UTC m=+1240.631369597" Feb 19 18:54:27 crc kubenswrapper[4749]: I0219 18:54:27.346728 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:27 crc kubenswrapper[4749]: I0219 18:54:27.396638 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.685609 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb0ae847-95f0-442e-82e7-2c05a874e190","Type":"ContainerStarted","Data":"343d4bd70cd44f515f226d72b98302c83a6c50fada29f403a665b407b3568618"} Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.686300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb0ae847-95f0-442e-82e7-2c05a874e190","Type":"ContainerStarted","Data":"247756fae8a812af4a873a1af74ad7383188c41a2b9446417410a2a8690ff733"} Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.688125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6f83341-b64f-4544-b33d-bd987d9be64e","Type":"ContainerStarted","Data":"156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161"} Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.689884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90","Type":"ContainerStarted","Data":"f3fec739f3fcb3c218257c6cfa29d8f0f6188766399ffa7308e9f4d3fdef6a38"} Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.689929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90","Type":"ContainerStarted","Data":"15b19a66cb9d3441a5f225988feac061074033dd42033f205dd2395c74a566e7"} Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 
18:54:29.690082 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-log" containerID="cri-o://15b19a66cb9d3441a5f225988feac061074033dd42033f205dd2395c74a566e7" gracePeriod=30 Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.690147 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-metadata" containerID="cri-o://f3fec739f3fcb3c218257c6cfa29d8f0f6188766399ffa7308e9f4d3fdef6a38" gracePeriod=30 Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.706589 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0fab25af-587f-47c5-a11c-c733e64c3be9","Type":"ContainerStarted","Data":"561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382"} Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.706694 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0fab25af-587f-47c5-a11c-c733e64c3be9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382" gracePeriod=30 Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.746859 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.066923 podStartE2EDuration="6.74684499s" podCreationTimestamp="2026-02-19 18:54:23 +0000 UTC" firstStartedPulling="2026-02-19 18:54:25.055496893 +0000 UTC m=+1239.016716847" lastFinishedPulling="2026-02-19 18:54:28.735418863 +0000 UTC m=+1242.696638837" observedRunningTime="2026-02-19 18:54:29.744093653 +0000 UTC m=+1243.705313617" watchObservedRunningTime="2026-02-19 18:54:29.74684499 +0000 UTC m=+1243.708064944" Feb 19 18:54:29 crc kubenswrapper[4749]: 
I0219 18:54:29.753709 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.486552388 podStartE2EDuration="6.753698596s" podCreationTimestamp="2026-02-19 18:54:23 +0000 UTC" firstStartedPulling="2026-02-19 18:54:24.467314913 +0000 UTC m=+1238.428534867" lastFinishedPulling="2026-02-19 18:54:28.734461121 +0000 UTC m=+1242.695681075" observedRunningTime="2026-02-19 18:54:29.724661113 +0000 UTC m=+1243.685881067" watchObservedRunningTime="2026-02-19 18:54:29.753698596 +0000 UTC m=+1243.714918550" Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.859755 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.065544816 podStartE2EDuration="6.859735592s" podCreationTimestamp="2026-02-19 18:54:23 +0000 UTC" firstStartedPulling="2026-02-19 18:54:24.947068419 +0000 UTC m=+1238.908288363" lastFinishedPulling="2026-02-19 18:54:28.741259185 +0000 UTC m=+1242.702479139" observedRunningTime="2026-02-19 18:54:29.810401048 +0000 UTC m=+1243.771620992" watchObservedRunningTime="2026-02-19 18:54:29.859735592 +0000 UTC m=+1243.820955546" Feb 19 18:54:29 crc kubenswrapper[4749]: I0219 18:54:29.860883 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.620312736 podStartE2EDuration="6.86087691s" podCreationTimestamp="2026-02-19 18:54:23 +0000 UTC" firstStartedPulling="2026-02-19 18:54:24.483983547 +0000 UTC m=+1238.445203501" lastFinishedPulling="2026-02-19 18:54:28.724547721 +0000 UTC m=+1242.685767675" observedRunningTime="2026-02-19 18:54:29.853188845 +0000 UTC m=+1243.814408799" watchObservedRunningTime="2026-02-19 18:54:29.86087691 +0000 UTC m=+1243.822096864" Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.722638 4749 generic.go:334] "Generic (PLEG): container finished" podID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" 
containerID="f3fec739f3fcb3c218257c6cfa29d8f0f6188766399ffa7308e9f4d3fdef6a38" exitCode=0 Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.722960 4749 generic.go:334] "Generic (PLEG): container finished" podID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerID="15b19a66cb9d3441a5f225988feac061074033dd42033f205dd2395c74a566e7" exitCode=143 Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.722766 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90","Type":"ContainerDied","Data":"f3fec739f3fcb3c218257c6cfa29d8f0f6188766399ffa7308e9f4d3fdef6a38"} Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.723899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90","Type":"ContainerDied","Data":"15b19a66cb9d3441a5f225988feac061074033dd42033f205dd2395c74a566e7"} Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.900134 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.974759 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-combined-ca-bundle\") pod \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.974851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqm76\" (UniqueName: \"kubernetes.io/projected/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-kube-api-access-tqm76\") pod \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.974965 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-config-data\") pod \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.975047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-logs\") pod \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\" (UID: \"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90\") " Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.975703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-logs" (OuterVolumeSpecName: "logs") pod "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" (UID: "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:54:30 crc kubenswrapper[4749]: I0219 18:54:30.994819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-kube-api-access-tqm76" (OuterVolumeSpecName: "kube-api-access-tqm76") pod "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" (UID: "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90"). InnerVolumeSpecName "kube-api-access-tqm76". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.017199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-config-data" (OuterVolumeSpecName: "config-data") pod "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" (UID: "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.017884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" (UID: "a3bc8e41-cdb9-48ac-a47d-f4e4624fae90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.077687 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.078006 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqm76\" (UniqueName: \"kubernetes.io/projected/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-kube-api-access-tqm76\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.078037 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.078046 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.733261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3bc8e41-cdb9-48ac-a47d-f4e4624fae90","Type":"ContainerDied","Data":"8e47a92d45ec2660f5009808181dd98ef6ffc180d2e9e1de193d7d2c66b8b777"} Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.733313 4749 scope.go:117] "RemoveContainer" containerID="f3fec739f3fcb3c218257c6cfa29d8f0f6188766399ffa7308e9f4d3fdef6a38" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.733327 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.752173 4749 scope.go:117] "RemoveContainer" containerID="15b19a66cb9d3441a5f225988feac061074033dd42033f205dd2395c74a566e7" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.773105 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.797316 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.809046 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:31 crc kubenswrapper[4749]: E0219 18:54:31.809498 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-metadata" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.809519 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-metadata" Feb 19 18:54:31 crc kubenswrapper[4749]: E0219 18:54:31.809562 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-log" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.809572 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-log" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.809808 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-metadata" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.809835 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" containerName="nova-metadata-log" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.811062 4749 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.813433 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.813645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.820413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.894516 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-config-data\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.894554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.894609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81617779-849b-4cad-a049-580774adcde3-logs\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.894656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqhk\" (UniqueName: \"kubernetes.io/projected/81617779-849b-4cad-a049-580774adcde3-kube-api-access-htqhk\") pod 
\"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.894685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.996113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-config-data\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.996161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.996232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81617779-849b-4cad-a049-580774adcde3-logs\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.996294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htqhk\" (UniqueName: \"kubernetes.io/projected/81617779-849b-4cad-a049-580774adcde3-kube-api-access-htqhk\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 
18:54:31.996333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:31 crc kubenswrapper[4749]: I0219 18:54:31.997545 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81617779-849b-4cad-a049-580774adcde3-logs\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.002925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.003808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-config-data\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.003860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.013952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqhk\" (UniqueName: 
\"kubernetes.io/projected/81617779-849b-4cad-a049-580774adcde3-kube-api-access-htqhk\") pod \"nova-metadata-0\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") " pod="openstack/nova-metadata-0" Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.138889 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.631330 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:54:32 crc kubenswrapper[4749]: W0219 18:54:32.637018 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81617779_849b_4cad_a049_580774adcde3.slice/crio-dadf660486630ab4c0d4b1a9ccfc817cba6103801de00263f83b4858197ad88c WatchSource:0}: Error finding container dadf660486630ab4c0d4b1a9ccfc817cba6103801de00263f83b4858197ad88c: Status 404 returned error can't find the container with id dadf660486630ab4c0d4b1a9ccfc817cba6103801de00263f83b4858197ad88c Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.690332 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3bc8e41-cdb9-48ac-a47d-f4e4624fae90" path="/var/lib/kubelet/pods/a3bc8e41-cdb9-48ac-a47d-f4e4624fae90/volumes" Feb 19 18:54:32 crc kubenswrapper[4749]: I0219 18:54:32.749270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81617779-849b-4cad-a049-580774adcde3","Type":"ContainerStarted","Data":"dadf660486630ab4c0d4b1a9ccfc817cba6103801de00263f83b4858197ad88c"} Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.642936 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.765177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"81617779-849b-4cad-a049-580774adcde3","Type":"ContainerStarted","Data":"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"} Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.765613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81617779-849b-4cad-a049-580774adcde3","Type":"ContainerStarted","Data":"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"} Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.770412 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.770453 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.773963 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" containerID="e741afc503439fe8962a221396ee12bb86f3f6ae7001a6e89147f203bd41ac97" exitCode=0 Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.774019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnwdf" event={"ID":"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7","Type":"ContainerDied","Data":"e741afc503439fe8962a221396ee12bb86f3f6ae7001a6e89147f203bd41ac97"} Feb 19 18:54:33 crc kubenswrapper[4749]: I0219 18:54:33.788095 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.788067157 podStartE2EDuration="2.788067157s" podCreationTimestamp="2026-02-19 18:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:33.784740927 +0000 UTC m=+1247.745960891" watchObservedRunningTime="2026-02-19 18:54:33.788067157 +0000 UTC m=+1247.749287121" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.069483 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.111954 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.139671 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.139720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.173860 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b767c57c-tn7gd"] Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.174182 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerName="dnsmasq-dns" containerID="cri-o://e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9" gracePeriod=10 Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.258040 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.278378 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: connect: connection refused" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.769340 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.782424 4749 generic.go:334] "Generic (PLEG): container finished" podID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerID="e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9" exitCode=0 Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.783297 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.783462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" event={"ID":"ca3e53a8-410a-4b02-95e8-00a98704a7ff","Type":"ContainerDied","Data":"e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9"} Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.783489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b767c57c-tn7gd" event={"ID":"ca3e53a8-410a-4b02-95e8-00a98704a7ff","Type":"ContainerDied","Data":"a21fb7fcde68258d7052ea52a31d36f2967e4c5e2d9e0707b7e0a6a0e66d9306"} Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.783505 4749 scope.go:117] "RemoveContainer" containerID="e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.812764 4749 scope.go:117] "RemoveContainer" containerID="6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.855608 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.855715 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.873637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmdzt\" (UniqueName: \"kubernetes.io/projected/ca3e53a8-410a-4b02-95e8-00a98704a7ff-kube-api-access-kmdzt\") pod \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") "
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.873845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-nb\") pod \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") "
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.873910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-svc\") pod \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") "
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.873964 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-config\") pod \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") "
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.874003 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-swift-storage-0\") pod \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") "
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.874096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-sb\") pod \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\" (UID: \"ca3e53a8-410a-4b02-95e8-00a98704a7ff\") "
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.891208 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3e53a8-410a-4b02-95e8-00a98704a7ff-kube-api-access-kmdzt" (OuterVolumeSpecName: "kube-api-access-kmdzt") pod "ca3e53a8-410a-4b02-95e8-00a98704a7ff" (UID: "ca3e53a8-410a-4b02-95e8-00a98704a7ff"). InnerVolumeSpecName "kube-api-access-kmdzt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.921622 4749 scope.go:117] "RemoveContainer" containerID="e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9"
Feb 19 18:54:34 crc kubenswrapper[4749]: E0219 18:54:34.922420 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9\": container with ID starting with e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9 not found: ID does not exist" containerID="e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9"
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.922467 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9"} err="failed to get container status \"e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9\": rpc error: code = NotFound desc = could not find container \"e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9\": container with ID starting with e95b6ba4347f86f30103b4b060450d70f5978efb2ae54f9c884bd7cdfa7d80e9 not found: ID does not exist"
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.922516 4749 scope.go:117] "RemoveContainer" containerID="6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302"
Feb 19 18:54:34 crc kubenswrapper[4749]: E0219 18:54:34.922961 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302\": container with ID starting with 6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302 not found: ID does not exist" containerID="6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302"
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.922979 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302"} err="failed to get container status \"6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302\": rpc error: code = NotFound desc = could not find container \"6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302\": container with ID starting with 6fa7334f35c9362ca76ecadac6663a643e0d396bf0ab803232661d936faa3302 not found: ID does not exist"
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.926482 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.956150 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ca3e53a8-410a-4b02-95e8-00a98704a7ff" (UID: "ca3e53a8-410a-4b02-95e8-00a98704a7ff"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.967604 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca3e53a8-410a-4b02-95e8-00a98704a7ff" (UID: "ca3e53a8-410a-4b02-95e8-00a98704a7ff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.977156 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmdzt\" (UniqueName: \"kubernetes.io/projected/ca3e53a8-410a-4b02-95e8-00a98704a7ff-kube-api-access-kmdzt\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.977182 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.977193 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:34 crc kubenswrapper[4749]: I0219 18:54:34.986929 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-config" (OuterVolumeSpecName: "config") pod "ca3e53a8-410a-4b02-95e8-00a98704a7ff" (UID: "ca3e53a8-410a-4b02-95e8-00a98704a7ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.010593 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca3e53a8-410a-4b02-95e8-00a98704a7ff" (UID: "ca3e53a8-410a-4b02-95e8-00a98704a7ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.015550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca3e53a8-410a-4b02-95e8-00a98704a7ff" (UID: "ca3e53a8-410a-4b02-95e8-00a98704a7ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.078913 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.078947 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.078956 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3e53a8-410a-4b02-95e8-00a98704a7ff-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.124806 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b767c57c-tn7gd"]
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.135113 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b767c57c-tn7gd"]
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.154352 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnwdf"
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.182108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-config-data\") pod \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") "
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.182190 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-scripts\") pod \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") "
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.182275 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-combined-ca-bundle\") pod \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") "
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.182338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bvns\" (UniqueName: \"kubernetes.io/projected/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-kube-api-access-5bvns\") pod \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\" (UID: \"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7\") "
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.196749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-kube-api-access-5bvns" (OuterVolumeSpecName: "kube-api-access-5bvns") pod "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" (UID: "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7"). InnerVolumeSpecName "kube-api-access-5bvns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.199944 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-scripts" (OuterVolumeSpecName: "scripts") pod "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" (UID: "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.226297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-config-data" (OuterVolumeSpecName: "config-data") pod "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" (UID: "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.252463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" (UID: "ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.303077 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.303110 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bvns\" (UniqueName: \"kubernetes.io/projected/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-kube-api-access-5bvns\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.303124 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.303135 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.799316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnwdf" event={"ID":"ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7","Type":"ContainerDied","Data":"14be9ba82fec78e36a048e7ab8ca2c33ca37c41cd7b77353df49c3783957d62e"}
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.799361 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14be9ba82fec78e36a048e7ab8ca2c33ca37c41cd7b77353df49c3783957d62e"
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.799334 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnwdf"
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.977495 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.977729 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-log" containerID="cri-o://247756fae8a812af4a873a1af74ad7383188c41a2b9446417410a2a8690ff733" gracePeriod=30
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.977886 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-api" containerID="cri-o://343d4bd70cd44f515f226d72b98302c83a6c50fada29f403a665b407b3568618" gracePeriod=30
Feb 19 18:54:35 crc kubenswrapper[4749]: I0219 18:54:35.995013 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.016437 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.016957 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-log" containerID="cri-o://bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac" gracePeriod=30
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.017527 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-metadata" containerID="cri-o://e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c" gracePeriod=30
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.671123 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.692350 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" path="/var/lib/kubelet/pods/ca3e53a8-410a-4b02-95e8-00a98704a7ff/volumes"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.737857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-nova-metadata-tls-certs\") pod \"81617779-849b-4cad-a049-580774adcde3\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") "
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.738089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-config-data\") pod \"81617779-849b-4cad-a049-580774adcde3\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") "
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.738220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htqhk\" (UniqueName: \"kubernetes.io/projected/81617779-849b-4cad-a049-580774adcde3-kube-api-access-htqhk\") pod \"81617779-849b-4cad-a049-580774adcde3\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") "
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.738305 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81617779-849b-4cad-a049-580774adcde3-logs\") pod \"81617779-849b-4cad-a049-580774adcde3\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") "
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.738333 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-combined-ca-bundle\") pod \"81617779-849b-4cad-a049-580774adcde3\" (UID: \"81617779-849b-4cad-a049-580774adcde3\") "
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.739546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81617779-849b-4cad-a049-580774adcde3-logs" (OuterVolumeSpecName: "logs") pod "81617779-849b-4cad-a049-580774adcde3" (UID: "81617779-849b-4cad-a049-580774adcde3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.769470 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81617779-849b-4cad-a049-580774adcde3-kube-api-access-htqhk" (OuterVolumeSpecName: "kube-api-access-htqhk") pod "81617779-849b-4cad-a049-580774adcde3" (UID: "81617779-849b-4cad-a049-580774adcde3"). InnerVolumeSpecName "kube-api-access-htqhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.805195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81617779-849b-4cad-a049-580774adcde3" (UID: "81617779-849b-4cad-a049-580774adcde3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.841360 4749 generic.go:334] "Generic (PLEG): container finished" podID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerID="247756fae8a812af4a873a1af74ad7383188c41a2b9446417410a2a8690ff733" exitCode=143
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.841531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb0ae847-95f0-442e-82e7-2c05a874e190","Type":"ContainerDied","Data":"247756fae8a812af4a873a1af74ad7383188c41a2b9446417410a2a8690ff733"}
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.842941 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81617779-849b-4cad-a049-580774adcde3-logs\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.842959 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.842969 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htqhk\" (UniqueName: \"kubernetes.io/projected/81617779-849b-4cad-a049-580774adcde3-kube-api-access-htqhk\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.861400 4749 generic.go:334] "Generic (PLEG): container finished" podID="81617779-849b-4cad-a049-580774adcde3" containerID="e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c" exitCode=0
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.861431 4749 generic.go:334] "Generic (PLEG): container finished" podID="81617779-849b-4cad-a049-580774adcde3" containerID="bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac" exitCode=143
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.861597 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c6f83341-b64f-4544-b33d-bd987d9be64e" containerName="nova-scheduler-scheduler" containerID="cri-o://156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" gracePeriod=30
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.861890 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.862126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81617779-849b-4cad-a049-580774adcde3","Type":"ContainerDied","Data":"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"}
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.862167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81617779-849b-4cad-a049-580774adcde3","Type":"ContainerDied","Data":"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"}
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.862181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81617779-849b-4cad-a049-580774adcde3","Type":"ContainerDied","Data":"dadf660486630ab4c0d4b1a9ccfc817cba6103801de00263f83b4858197ad88c"}
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.862199 4749 scope.go:117] "RemoveContainer" containerID="e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.873217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-config-data" (OuterVolumeSpecName: "config-data") pod "81617779-849b-4cad-a049-580774adcde3" (UID: "81617779-849b-4cad-a049-580774adcde3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.944712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "81617779-849b-4cad-a049-580774adcde3" (UID: "81617779-849b-4cad-a049-580774adcde3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.945120 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.945149 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81617779-849b-4cad-a049-580774adcde3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.946622 4749 scope.go:117] "RemoveContainer" containerID="bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.977408 4749 scope.go:117] "RemoveContainer" containerID="e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"
Feb 19 18:54:36 crc kubenswrapper[4749]: E0219 18:54:36.977984 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c\": container with ID starting with e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c not found: ID does not exist" containerID="e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.978013 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"} err="failed to get container status \"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c\": rpc error: code = NotFound desc = could not find container \"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c\": container with ID starting with e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c not found: ID does not exist"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.978052 4749 scope.go:117] "RemoveContainer" containerID="bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"
Feb 19 18:54:36 crc kubenswrapper[4749]: E0219 18:54:36.978374 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac\": container with ID starting with bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac not found: ID does not exist" containerID="bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.978390 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"} err="failed to get container status \"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac\": rpc error: code = NotFound desc = could not find container \"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac\": container with ID starting with bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac not found: ID does not exist"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.978409 4749 scope.go:117] "RemoveContainer" containerID="e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.978730 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c"} err="failed to get container status \"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c\": rpc error: code = NotFound desc = could not find container \"e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c\": container with ID starting with e7187a6b147dc178c0a73bf79bbb6f37a919d97e47668d408dc43506bac2774c not found: ID does not exist"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.978758 4749 scope.go:117] "RemoveContainer" containerID="bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"
Feb 19 18:54:36 crc kubenswrapper[4749]: I0219 18:54:36.979042 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac"} err="failed to get container status \"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac\": rpc error: code = NotFound desc = could not find container \"bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac\": container with ID starting with bd61a3a6ebfdaba19c38fa5ef758c19c60b9ca2d3d5f10013ddd9da67a7e8eac not found: ID does not exist"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.195813 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.210574 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.219719 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 18:54:37 crc kubenswrapper[4749]: E0219 18:54:37.236684 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-log"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.236713 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-log"
Feb 19 18:54:37 crc kubenswrapper[4749]: E0219 18:54:37.236735 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerName="dnsmasq-dns"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.236743 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerName="dnsmasq-dns"
Feb 19 18:54:37 crc kubenswrapper[4749]: E0219 18:54:37.236769 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-metadata"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.236777 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-metadata"
Feb 19 18:54:37 crc kubenswrapper[4749]: E0219 18:54:37.236796 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerName="init"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.236802 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerName="init"
Feb 19 18:54:37 crc kubenswrapper[4749]: E0219 18:54:37.236810 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" containerName="nova-manage"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.236816 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" containerName="nova-manage"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.237001 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-metadata"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.237042 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="81617779-849b-4cad-a049-580774adcde3" containerName="nova-metadata-log"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.237054 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" containerName="nova-manage"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.237062 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3e53a8-410a-4b02-95e8-00a98704a7ff" containerName="dnsmasq-dns"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.238039 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.238155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.244192 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.250318 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.352049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.352107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6w6g\" (UniqueName: \"kubernetes.io/projected/f489af23-b6c5-42a0-84ef-f5fb1910c79a-kube-api-access-z6w6g\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.352163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f489af23-b6c5-42a0-84ef-f5fb1910c79a-logs\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.352191 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-config-data\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.352345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.454177 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6w6g\" (UniqueName: \"kubernetes.io/projected/f489af23-b6c5-42a0-84ef-f5fb1910c79a-kube-api-access-z6w6g\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.454247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f489af23-b6c5-42a0-84ef-f5fb1910c79a-logs\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.454282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-config-data\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.454409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.454459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.455716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f489af23-b6c5-42a0-84ef-f5fb1910c79a-logs\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.458415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.460068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-config-data\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.460467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.475704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6w6g\" (UniqueName: \"kubernetes.io/projected/f489af23-b6c5-42a0-84ef-f5fb1910c79a-kube-api-access-z6w6g\") pod \"nova-metadata-0\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.558264 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.874162 4749 generic.go:334] "Generic (PLEG): container finished" podID="46f858df-0b0b-4e6a-ada4-c3063dbae4c5" containerID="5235264586dc444c39526b9f671323ba394d06f3ac246609350fd331e16a09fa" exitCode=0
Feb 19 18:54:37 crc kubenswrapper[4749]: I0219 18:54:37.874202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-znh79" event={"ID":"46f858df-0b0b-4e6a-ada4-c3063dbae4c5","Type":"ContainerDied","Data":"5235264586dc444c39526b9f671323ba394d06f3ac246609350fd331e16a09fa"}
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.094062 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.521584 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.521841 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="872559f0-434b-4f87-b6de-c8c56cad33c3" containerName="kube-state-metrics" containerID="cri-o://4137f06e3b49a547062d31e61ecdaae382c94d659728ddfdd4ead2d540194717" gracePeriod=30
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.696101 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81617779-849b-4cad-a049-580774adcde3" path="/var/lib/kubelet/pods/81617779-849b-4cad-a049-580774adcde3/volumes"
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.894396 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f489af23-b6c5-42a0-84ef-f5fb1910c79a","Type":"ContainerStarted","Data":"6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad"}
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.894437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f489af23-b6c5-42a0-84ef-f5fb1910c79a","Type":"ContainerStarted","Data":"07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80"}
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.894464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f489af23-b6c5-42a0-84ef-f5fb1910c79a","Type":"ContainerStarted","Data":"43e74321512e355c3daa85b6e126ec17b1a2227bd926560bf1df662688e4d10a"}
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.898538 4749 generic.go:334] "Generic (PLEG): container finished" podID="872559f0-434b-4f87-b6de-c8c56cad33c3" containerID="4137f06e3b49a547062d31e61ecdaae382c94d659728ddfdd4ead2d540194717" exitCode=2
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.898739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"872559f0-434b-4f87-b6de-c8c56cad33c3","Type":"ContainerDied","Data":"4137f06e3b49a547062d31e61ecdaae382c94d659728ddfdd4ead2d540194717"}
Feb 19 18:54:38 crc kubenswrapper[4749]: I0219 18:54:38.898739 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack/nova-metadata-0" podStartSLOduration=1.9100114480000001 podStartE2EDuration="1.910011448s" podCreationTimestamp="2026-02-19 18:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:38.909090796 +0000 UTC m=+1252.870310760" watchObservedRunningTime="2026-02-19 18:54:38.910011448 +0000 UTC m=+1252.871231402" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.044283 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.085848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwmw4\" (UniqueName: \"kubernetes.io/projected/872559f0-434b-4f87-b6de-c8c56cad33c3-kube-api-access-gwmw4\") pod \"872559f0-434b-4f87-b6de-c8c56cad33c3\" (UID: \"872559f0-434b-4f87-b6de-c8c56cad33c3\") " Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.094081 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872559f0-434b-4f87-b6de-c8c56cad33c3-kube-api-access-gwmw4" (OuterVolumeSpecName: "kube-api-access-gwmw4") pod "872559f0-434b-4f87-b6de-c8c56cad33c3" (UID: "872559f0-434b-4f87-b6de-c8c56cad33c3"). InnerVolumeSpecName "kube-api-access-gwmw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:54:39 crc kubenswrapper[4749]: E0219 18:54:39.152577 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 18:54:39 crc kubenswrapper[4749]: E0219 18:54:39.154662 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 18:54:39 crc kubenswrapper[4749]: E0219 18:54:39.156646 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 18:54:39 crc kubenswrapper[4749]: E0219 18:54:39.156684 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c6f83341-b64f-4544-b33d-bd987d9be64e" containerName="nova-scheduler-scheduler" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.188446 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwmw4\" (UniqueName: \"kubernetes.io/projected/872559f0-434b-4f87-b6de-c8c56cad33c3-kube-api-access-gwmw4\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.227160 4749 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.290084 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-config-data\") pod \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.290187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-scripts\") pod \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.290240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg2qg\" (UniqueName: \"kubernetes.io/projected/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-kube-api-access-wg2qg\") pod \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.290335 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-combined-ca-bundle\") pod \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\" (UID: \"46f858df-0b0b-4e6a-ada4-c3063dbae4c5\") " Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.295087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-kube-api-access-wg2qg" (OuterVolumeSpecName: "kube-api-access-wg2qg") pod "46f858df-0b0b-4e6a-ada4-c3063dbae4c5" (UID: "46f858df-0b0b-4e6a-ada4-c3063dbae4c5"). InnerVolumeSpecName "kube-api-access-wg2qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.299213 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-scripts" (OuterVolumeSpecName: "scripts") pod "46f858df-0b0b-4e6a-ada4-c3063dbae4c5" (UID: "46f858df-0b0b-4e6a-ada4-c3063dbae4c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.319875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46f858df-0b0b-4e6a-ada4-c3063dbae4c5" (UID: "46f858df-0b0b-4e6a-ada4-c3063dbae4c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.335141 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-config-data" (OuterVolumeSpecName: "config-data") pod "46f858df-0b0b-4e6a-ada4-c3063dbae4c5" (UID: "46f858df-0b0b-4e6a-ada4-c3063dbae4c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.392520 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.392557 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.392567 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.392576 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg2qg\" (UniqueName: \"kubernetes.io/projected/46f858df-0b0b-4e6a-ada4-c3063dbae4c5-kube-api-access-wg2qg\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.908096 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"872559f0-434b-4f87-b6de-c8c56cad33c3","Type":"ContainerDied","Data":"abf5950bdbfbf9f265d9023b1fdf57043643ce879cd4128f70cb8637917306e2"} Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.908134 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.908208 4749 scope.go:117] "RemoveContainer" containerID="4137f06e3b49a547062d31e61ecdaae382c94d659728ddfdd4ead2d540194717" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.911665 4749 generic.go:334] "Generic (PLEG): container finished" podID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerID="343d4bd70cd44f515f226d72b98302c83a6c50fada29f403a665b407b3568618" exitCode=0 Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.911696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb0ae847-95f0-442e-82e7-2c05a874e190","Type":"ContainerDied","Data":"343d4bd70cd44f515f226d72b98302c83a6c50fada29f403a665b407b3568618"} Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.914448 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-znh79" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.916327 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-znh79" event={"ID":"46f858df-0b0b-4e6a-ada4-c3063dbae4c5","Type":"ContainerDied","Data":"84e6e54b5695689f3aa045d425ba9110e3cfd1e0e0b861348e303f70e9771ec6"} Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.916367 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e6e54b5695689f3aa045d425ba9110e3cfd1e0e0b861348e303f70e9771ec6" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.958075 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 18:54:39 crc kubenswrapper[4749]: E0219 18:54:39.958451 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f858df-0b0b-4e6a-ada4-c3063dbae4c5" containerName="nova-cell1-conductor-db-sync" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.958467 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="46f858df-0b0b-4e6a-ada4-c3063dbae4c5" containerName="nova-cell1-conductor-db-sync" Feb 19 18:54:39 crc kubenswrapper[4749]: E0219 18:54:39.958497 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872559f0-434b-4f87-b6de-c8c56cad33c3" containerName="kube-state-metrics" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.958503 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="872559f0-434b-4f87-b6de-c8c56cad33c3" containerName="kube-state-metrics" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.958660 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f858df-0b0b-4e6a-ada4-c3063dbae4c5" containerName="nova-cell1-conductor-db-sync" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.958690 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="872559f0-434b-4f87-b6de-c8c56cad33c3" containerName="kube-state-metrics" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.959381 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.964764 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 18:54:39 crc kubenswrapper[4749]: I0219 18:54:39.978281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.011317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0780a73c-852b-470b-9de7-61afde153d72-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.011435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0780a73c-852b-470b-9de7-61afde153d72-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.011534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqggm\" (UniqueName: \"kubernetes.io/projected/0780a73c-852b-470b-9de7-61afde153d72-kube-api-access-nqggm\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.089148 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.101892 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.111261 4749 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.113569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0780a73c-852b-470b-9de7-61afde153d72-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.113632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0780a73c-852b-470b-9de7-61afde153d72-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.113703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqggm\" (UniqueName: \"kubernetes.io/projected/0780a73c-852b-470b-9de7-61afde153d72-kube-api-access-nqggm\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.114071 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.120268 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.123507 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.123698 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.123838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0780a73c-852b-470b-9de7-61afde153d72-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.128521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0780a73c-852b-470b-9de7-61afde153d72-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.138834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqggm\" (UniqueName: \"kubernetes.io/projected/0780a73c-852b-470b-9de7-61afde153d72-kube-api-access-nqggm\") pod \"nova-cell1-conductor-0\" (UID: \"0780a73c-852b-470b-9de7-61afde153d72\") " pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.221377 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-state-metrics-tls-certs\") pod 
\"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.322993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swtjd\" (UniqueName: \"kubernetes.io/projected/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-api-access-swtjd\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.323097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.323144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.323190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.327854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.336240 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.348523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.425848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-combined-ca-bundle\") pod \"cb0ae847-95f0-442e-82e7-2c05a874e190\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.426339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.426413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.426615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swtjd\" (UniqueName: \"kubernetes.io/projected/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-api-access-swtjd\") pod \"kube-state-metrics-0\" 
(UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.430848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.430969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6855d59-78cd-4386-b41b-8670ebdadafb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.445494 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swtjd\" (UniqueName: \"kubernetes.io/projected/e6855d59-78cd-4386-b41b-8670ebdadafb-kube-api-access-swtjd\") pod \"kube-state-metrics-0\" (UID: \"e6855d59-78cd-4386-b41b-8670ebdadafb\") " pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.464538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb0ae847-95f0-442e-82e7-2c05a874e190" (UID: "cb0ae847-95f0-442e-82e7-2c05a874e190"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.528039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxmcc\" (UniqueName: \"kubernetes.io/projected/cb0ae847-95f0-442e-82e7-2c05a874e190-kube-api-access-qxmcc\") pod \"cb0ae847-95f0-442e-82e7-2c05a874e190\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.528083 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-config-data\") pod \"cb0ae847-95f0-442e-82e7-2c05a874e190\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.528145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb0ae847-95f0-442e-82e7-2c05a874e190-logs\") pod \"cb0ae847-95f0-442e-82e7-2c05a874e190\" (UID: \"cb0ae847-95f0-442e-82e7-2c05a874e190\") " Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.528635 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.529081 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb0ae847-95f0-442e-82e7-2c05a874e190-logs" (OuterVolumeSpecName: "logs") pod "cb0ae847-95f0-442e-82e7-2c05a874e190" (UID: "cb0ae847-95f0-442e-82e7-2c05a874e190"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.530550 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.534473 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0ae847-95f0-442e-82e7-2c05a874e190-kube-api-access-qxmcc" (OuterVolumeSpecName: "kube-api-access-qxmcc") pod "cb0ae847-95f0-442e-82e7-2c05a874e190" (UID: "cb0ae847-95f0-442e-82e7-2c05a874e190"). InnerVolumeSpecName "kube-api-access-qxmcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.575643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-config-data" (OuterVolumeSpecName: "config-data") pod "cb0ae847-95f0-442e-82e7-2c05a874e190" (UID: "cb0ae847-95f0-442e-82e7-2c05a874e190"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.629383 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxmcc\" (UniqueName: \"kubernetes.io/projected/cb0ae847-95f0-442e-82e7-2c05a874e190-kube-api-access-qxmcc\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.629416 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0ae847-95f0-442e-82e7-2c05a874e190-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.629428 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb0ae847-95f0-442e-82e7-2c05a874e190-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.698659 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872559f0-434b-4f87-b6de-c8c56cad33c3" path="/var/lib/kubelet/pods/872559f0-434b-4f87-b6de-c8c56cad33c3/volumes" Feb 19 
18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.808339 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.837116 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.837401 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-central-agent" containerID="cri-o://832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226" gracePeriod=30 Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.837533 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="proxy-httpd" containerID="cri-o://8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909" gracePeriod=30 Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.837572 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="sg-core" containerID="cri-o://1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9" gracePeriod=30 Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.837655 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-notification-agent" containerID="cri-o://5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde" gracePeriod=30 Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.923189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"0780a73c-852b-470b-9de7-61afde153d72","Type":"ContainerStarted","Data":"24c709e416690d175a8818d2951e48a45bfc5ff6928d5574bb7a13b0c01e8592"} Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.926004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb0ae847-95f0-442e-82e7-2c05a874e190","Type":"ContainerDied","Data":"889adf59bc131d682b291d5893b8863f04dacd374041707eead2f0b484206306"} Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.926098 4749 scope.go:117] "RemoveContainer" containerID="343d4bd70cd44f515f226d72b98302c83a6c50fada29f403a665b407b3568618" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.926194 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.958714 4749 scope.go:117] "RemoveContainer" containerID="247756fae8a812af4a873a1af74ad7383188c41a2b9446417410a2a8690ff733" Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.977051 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.986533 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:40 crc kubenswrapper[4749]: I0219 18:54:40.993737 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.014315 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:41 crc kubenswrapper[4749]: E0219 18:54:41.014720 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-log" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.014730 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-log" Feb 19 18:54:41 crc kubenswrapper[4749]: 
E0219 18:54:41.014763 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-api" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.014769 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-api" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.014966 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-log" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.014981 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" containerName="nova-api-api" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.015946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.019264 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.027018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.138861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-config-data\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.138977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc 
kubenswrapper[4749]: I0219 18:54:41.139219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vdt\" (UniqueName: \"kubernetes.io/projected/71f6b71f-4791-4a38-b5ae-282a0f643b6c-kube-api-access-68vdt\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.139372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71f6b71f-4791-4a38-b5ae-282a0f643b6c-logs\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.240568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-config-data\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.240626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.240723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vdt\" (UniqueName: \"kubernetes.io/projected/71f6b71f-4791-4a38-b5ae-282a0f643b6c-kube-api-access-68vdt\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.240768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/71f6b71f-4791-4a38-b5ae-282a0f643b6c-logs\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.241217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71f6b71f-4791-4a38-b5ae-282a0f643b6c-logs\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.247733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-config-data\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.248495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.262500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vdt\" (UniqueName: \"kubernetes.io/projected/71f6b71f-4791-4a38-b5ae-282a0f643b6c-kube-api-access-68vdt\") pod \"nova-api-0\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") " pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.367931 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.381157 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.545800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-config-data\") pod \"c6f83341-b64f-4544-b33d-bd987d9be64e\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.545955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr7qz\" (UniqueName: \"kubernetes.io/projected/c6f83341-b64f-4544-b33d-bd987d9be64e-kube-api-access-gr7qz\") pod \"c6f83341-b64f-4544-b33d-bd987d9be64e\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.546051 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-combined-ca-bundle\") pod \"c6f83341-b64f-4544-b33d-bd987d9be64e\" (UID: \"c6f83341-b64f-4544-b33d-bd987d9be64e\") " Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.553215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f83341-b64f-4544-b33d-bd987d9be64e-kube-api-access-gr7qz" (OuterVolumeSpecName: "kube-api-access-gr7qz") pod "c6f83341-b64f-4544-b33d-bd987d9be64e" (UID: "c6f83341-b64f-4544-b33d-bd987d9be64e"). InnerVolumeSpecName "kube-api-access-gr7qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.577243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6f83341-b64f-4544-b33d-bd987d9be64e" (UID: "c6f83341-b64f-4544-b33d-bd987d9be64e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.578263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-config-data" (OuterVolumeSpecName: "config-data") pod "c6f83341-b64f-4544-b33d-bd987d9be64e" (UID: "c6f83341-b64f-4544-b33d-bd987d9be64e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.647678 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.647706 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f83341-b64f-4544-b33d-bd987d9be64e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.647717 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr7qz\" (UniqueName: \"kubernetes.io/projected/c6f83341-b64f-4544-b33d-bd987d9be64e-kube-api-access-gr7qz\") on node \"crc\" DevicePath \"\"" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.849466 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.938242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0780a73c-852b-470b-9de7-61afde153d72","Type":"ContainerStarted","Data":"e8ea032a49730150e5f135283fbd4dd24ff9efb79f3a500a726ff2c2f5d14f15"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.938318 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.940634 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71f6b71f-4791-4a38-b5ae-282a0f643b6c","Type":"ContainerStarted","Data":"47d5a6add3446140db91922bebc6c24432b676470566ea0f1d6b087db762c784"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.942845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6855d59-78cd-4386-b41b-8670ebdadafb","Type":"ContainerStarted","Data":"9de380f8dfe2f5dcee9fdc4c5c262e8eccb0d1a73ecd56f7da7bc54df00e5b7b"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.942873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6855d59-78cd-4386-b41b-8670ebdadafb","Type":"ContainerStarted","Data":"701593ce7ab855dc7975ad68cb1b87483ee9433dddc4ff960c48383e7705c184"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.942988 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.945296 4749 generic.go:334] "Generic (PLEG): container finished" podID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerID="8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909" exitCode=0 Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.945327 4749 generic.go:334] "Generic (PLEG): container finished" podID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerID="1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9" exitCode=2 Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.945338 4749 generic.go:334] "Generic (PLEG): container finished" podID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerID="832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226" exitCode=0 Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.945396 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerDied","Data":"8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.945427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerDied","Data":"1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.945441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerDied","Data":"832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.947525 4749 generic.go:334] "Generic (PLEG): container finished" podID="c6f83341-b64f-4544-b33d-bd987d9be64e" containerID="156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" exitCode=0 Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.947575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6f83341-b64f-4544-b33d-bd987d9be64e","Type":"ContainerDied","Data":"156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.947606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6f83341-b64f-4544-b33d-bd987d9be64e","Type":"ContainerDied","Data":"f9fd49ba5e07af95a9bdbff645fa68a36b92b727d0b25a508be7aebd450578b3"} Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.947627 4749 scope.go:117] "RemoveContainer" containerID="156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.947721 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.963051 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.963017421 podStartE2EDuration="2.963017421s" podCreationTimestamp="2026-02-19 18:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:41.955965551 +0000 UTC m=+1255.917185515" watchObservedRunningTime="2026-02-19 18:54:41.963017421 +0000 UTC m=+1255.924237375" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.976209 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.467370312 podStartE2EDuration="1.97619251s" podCreationTimestamp="2026-02-19 18:54:40 +0000 UTC" firstStartedPulling="2026-02-19 18:54:41.09595453 +0000 UTC m=+1255.057174484" lastFinishedPulling="2026-02-19 18:54:41.604776728 +0000 UTC m=+1255.565996682" observedRunningTime="2026-02-19 18:54:41.972677075 +0000 UTC m=+1255.933897019" watchObservedRunningTime="2026-02-19 18:54:41.97619251 +0000 UTC m=+1255.937412464" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.981670 4749 scope.go:117] "RemoveContainer" containerID="156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" Feb 19 18:54:41 crc kubenswrapper[4749]: E0219 18:54:41.986527 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161\": container with ID starting with 156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161 not found: ID does not exist" containerID="156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161" Feb 19 18:54:41 crc kubenswrapper[4749]: I0219 18:54:41.986572 4749 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161"} err="failed to get container status \"156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161\": rpc error: code = NotFound desc = could not find container \"156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161\": container with ID starting with 156dc8d78f3f06e8526d7a8a2b7bd2e6e407211d1471e6febb54fd3b49d91161 not found: ID does not exist" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.004957 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.027726 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.042218 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:42 crc kubenswrapper[4749]: E0219 18:54:42.044648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f83341-b64f-4544-b33d-bd987d9be64e" containerName="nova-scheduler-scheduler" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.044676 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f83341-b64f-4544-b33d-bd987d9be64e" containerName="nova-scheduler-scheduler" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.044920 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f83341-b64f-4544-b33d-bd987d9be64e" containerName="nova-scheduler-scheduler" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.045784 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.049520 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.054369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.158689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.158986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-config-data\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.159179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftqcb\" (UniqueName: \"kubernetes.io/projected/471c4100-a672-4b83-842f-03a42aae3a92-kube-api-access-ftqcb\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.261404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftqcb\" (UniqueName: \"kubernetes.io/projected/471c4100-a672-4b83-842f-03a42aae3a92-kube-api-access-ftqcb\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.261985 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.262694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-config-data\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.268222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.268755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-config-data\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.280863 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftqcb\" (UniqueName: \"kubernetes.io/projected/471c4100-a672-4b83-842f-03a42aae3a92-kube-api-access-ftqcb\") pod \"nova-scheduler-0\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") " pod="openstack/nova-scheduler-0" Feb 19 18:54:42 crc kubenswrapper[4749]: I0219 18:54:42.377192 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.562236 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.563534 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.689187 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f83341-b64f-4544-b33d-bd987d9be64e" path="/var/lib/kubelet/pods/c6f83341-b64f-4544-b33d-bd987d9be64e/volumes" Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.689917 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb0ae847-95f0-442e-82e7-2c05a874e190" path="/var/lib/kubelet/pods/cb0ae847-95f0-442e-82e7-2c05a874e190/volumes" Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.868046 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.966525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"471c4100-a672-4b83-842f-03a42aae3a92","Type":"ContainerStarted","Data":"76e825c462dc9ffbb756f73bc814b5648f7fd9c75734fae235683cce027c3600"} Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.988015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71f6b71f-4791-4a38-b5ae-282a0f643b6c","Type":"ContainerStarted","Data":"5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c"} Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:42.988075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71f6b71f-4791-4a38-b5ae-282a0f643b6c","Type":"ContainerStarted","Data":"007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33"} Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:43.033839 
4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.033814335 podStartE2EDuration="3.033814335s" podCreationTimestamp="2026-02-19 18:54:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:43.011735761 +0000 UTC m=+1256.972955715" watchObservedRunningTime="2026-02-19 18:54:43.033814335 +0000 UTC m=+1256.995034289" Feb 19 18:54:43 crc kubenswrapper[4749]: I0219 18:54:43.997127 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"471c4100-a672-4b83-842f-03a42aae3a92","Type":"ContainerStarted","Data":"c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85"} Feb 19 18:54:44 crc kubenswrapper[4749]: I0219 18:54:44.016291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.01626779 podStartE2EDuration="3.01626779s" podCreationTimestamp="2026-02-19 18:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:54:44.011930585 +0000 UTC m=+1257.973150569" watchObservedRunningTime="2026-02-19 18:54:44.01626779 +0000 UTC m=+1257.977487734" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.741804 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.871068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht4pz\" (UniqueName: \"kubernetes.io/projected/b39e6857-5164-4568-b6e9-bbb966f59eaf-kube-api-access-ht4pz\") pod \"b39e6857-5164-4568-b6e9-bbb966f59eaf\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.871124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-log-httpd\") pod \"b39e6857-5164-4568-b6e9-bbb966f59eaf\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.871157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-scripts\") pod \"b39e6857-5164-4568-b6e9-bbb966f59eaf\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.871343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-run-httpd\") pod \"b39e6857-5164-4568-b6e9-bbb966f59eaf\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.871364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-sg-core-conf-yaml\") pod \"b39e6857-5164-4568-b6e9-bbb966f59eaf\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.871405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-config-data\") pod \"b39e6857-5164-4568-b6e9-bbb966f59eaf\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.871454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-combined-ca-bundle\") pod \"b39e6857-5164-4568-b6e9-bbb966f59eaf\" (UID: \"b39e6857-5164-4568-b6e9-bbb966f59eaf\") " Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.873144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b39e6857-5164-4568-b6e9-bbb966f59eaf" (UID: "b39e6857-5164-4568-b6e9-bbb966f59eaf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.873383 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b39e6857-5164-4568-b6e9-bbb966f59eaf" (UID: "b39e6857-5164-4568-b6e9-bbb966f59eaf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.878473 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w2ns9"] Feb 19 18:54:46 crc kubenswrapper[4749]: E0219 18:54:46.879003 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-notification-agent" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879042 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-notification-agent" Feb 19 18:54:46 crc kubenswrapper[4749]: E0219 18:54:46.879058 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="sg-core" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879067 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="sg-core" Feb 19 18:54:46 crc kubenswrapper[4749]: E0219 18:54:46.879085 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-central-agent" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879093 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-central-agent" Feb 19 18:54:46 crc kubenswrapper[4749]: E0219 18:54:46.879106 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="proxy-httpd" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879114 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="proxy-httpd" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879321 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-central-agent" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879359 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="ceilometer-notification-agent" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879369 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="sg-core" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.879381 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerName="proxy-httpd" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.880823 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ns9" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.889830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-scripts" (OuterVolumeSpecName: "scripts") pod "b39e6857-5164-4568-b6e9-bbb966f59eaf" (UID: "b39e6857-5164-4568-b6e9-bbb966f59eaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.895668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39e6857-5164-4568-b6e9-bbb966f59eaf-kube-api-access-ht4pz" (OuterVolumeSpecName: "kube-api-access-ht4pz") pod "b39e6857-5164-4568-b6e9-bbb966f59eaf" (UID: "b39e6857-5164-4568-b6e9-bbb966f59eaf"). InnerVolumeSpecName "kube-api-access-ht4pz". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.938086 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2ns9"]
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.976253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-utilities\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.976332 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79hxw\" (UniqueName: \"kubernetes.io/projected/144a66be-49a6-470b-af60-764fa5d4d7dd-kube-api-access-79hxw\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.976420 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-catalog-content\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.976595 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht4pz\" (UniqueName: \"kubernetes.io/projected/b39e6857-5164-4568-b6e9-bbb966f59eaf-kube-api-access-ht4pz\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.976611 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.976626 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.976637 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b39e6857-5164-4568-b6e9-bbb966f59eaf-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:46 crc kubenswrapper[4749]: I0219 18:54:46.977201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b39e6857-5164-4568-b6e9-bbb966f59eaf" (UID: "b39e6857-5164-4568-b6e9-bbb966f59eaf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.055222 4749 generic.go:334] "Generic (PLEG): container finished" podID="b39e6857-5164-4568-b6e9-bbb966f59eaf" containerID="5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde" exitCode=0
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.055863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerDied","Data":"5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde"}
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.068002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b39e6857-5164-4568-b6e9-bbb966f59eaf","Type":"ContainerDied","Data":"6965ff897db76677d359d2a97eaab40f61d5e0413240ad0f40f827670935a5b3"}
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.068056 4749 scope.go:117] "RemoveContainer" containerID="8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.055915 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.083281 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-config-data" (OuterVolumeSpecName: "config-data") pod "b39e6857-5164-4568-b6e9-bbb966f59eaf" (UID: "b39e6857-5164-4568-b6e9-bbb966f59eaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.087588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b39e6857-5164-4568-b6e9-bbb966f59eaf" (UID: "b39e6857-5164-4568-b6e9-bbb966f59eaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.087754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-catalog-content\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.088094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-utilities\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.088163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79hxw\" (UniqueName: \"kubernetes.io/projected/144a66be-49a6-470b-af60-764fa5d4d7dd-kube-api-access-79hxw\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.088356 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.088373 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.088388 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39e6857-5164-4568-b6e9-bbb966f59eaf-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.089020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-utilities\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.090108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-catalog-content\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.109232 4749 scope.go:117] "RemoveContainer" containerID="1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.113787 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79hxw\" (UniqueName: \"kubernetes.io/projected/144a66be-49a6-470b-af60-764fa5d4d7dd-kube-api-access-79hxw\") pod \"redhat-operators-w2ns9\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.137431 4749 scope.go:117] "RemoveContainer" containerID="5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.166318 4749 scope.go:117] "RemoveContainer" containerID="832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.190419 4749 scope.go:117] "RemoveContainer" containerID="8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909"
Feb 19 18:54:47 crc kubenswrapper[4749]: E0219 18:54:47.190848 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909\": container with ID starting with 8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909 not found: ID does not exist" containerID="8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.190894 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909"} err="failed to get container status \"8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909\": rpc error: code = NotFound desc = could not find container \"8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909\": container with ID starting with 8fd21324a96ab0276be5194c939a899c0d772ed167b83498e1c9ef6604d5c909 not found: ID does not exist"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.190925 4749 scope.go:117] "RemoveContainer" containerID="1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9"
Feb 19 18:54:47 crc kubenswrapper[4749]: E0219 18:54:47.191232 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9\": container with ID starting with 1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9 not found: ID does not exist" containerID="1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.191268 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9"} err="failed to get container status \"1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9\": rpc error: code = NotFound desc = could not find container \"1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9\": container with ID starting with 1c75e71934d67d6b6d41b48e045a78a1d625974cc7bca63d43248edaf85f80a9 not found: ID does not exist"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.191294 4749 scope.go:117] "RemoveContainer" containerID="5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde"
Feb 19 18:54:47 crc kubenswrapper[4749]: E0219 18:54:47.191543 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde\": container with ID starting with 5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde not found: ID does not exist" containerID="5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.191590 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde"} err="failed to get container status \"5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde\": rpc error: code = NotFound desc = could not find container \"5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde\": container with ID starting with 5799f1bf305e09e0667f2b557c57b80f10435c0e0b580337933ffeaa46c08bde not found: ID does not exist"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.191625 4749 scope.go:117] "RemoveContainer" containerID="832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226"
Feb 19 18:54:47 crc kubenswrapper[4749]: E0219 18:54:47.191866 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226\": container with ID starting with 832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226 not found: ID does not exist" containerID="832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.191892 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226"} err="failed to get container status \"832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226\": rpc error: code = NotFound desc = could not find container \"832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226\": container with ID starting with 832baa879a55b54bdd345cc180f3475e0ad99e513766be4f53df607d27df6226 not found: ID does not exist"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.300191 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.377424 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.477990 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.499588 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.510752 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.513309 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.515864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.516100 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.516233 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.518766 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.558406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.558686 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnr4m\" (UniqueName: \"kubernetes.io/projected/db79f1f6-dfad-45fd-b30c-c88c883f6318-kube-api-access-nnr4m\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701450 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-run-httpd\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-scripts\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-log-httpd\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.701762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-config-data\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.773967 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2ns9"]
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.803537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.803582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-run-httpd\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.803612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.803664 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.803695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-scripts\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.804206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-run-httpd\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.805869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-log-httpd\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.806209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-log-httpd\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.806295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-config-data\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.807279 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnr4m\" (UniqueName: \"kubernetes.io/projected/db79f1f6-dfad-45fd-b30c-c88c883f6318-kube-api-access-nnr4m\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.810885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-scripts\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.810147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.813892 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.814884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.815287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-config-data\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.827684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnr4m\" (UniqueName: \"kubernetes.io/projected/db79f1f6-dfad-45fd-b30c-c88c883f6318-kube-api-access-nnr4m\") pod \"ceilometer-0\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") " pod="openstack/ceilometer-0"
Feb 19 18:54:47 crc kubenswrapper[4749]: I0219 18:54:47.846009 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:54:48 crc kubenswrapper[4749]: I0219 18:54:48.085452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerStarted","Data":"5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242"}
Feb 19 18:54:48 crc kubenswrapper[4749]: I0219 18:54:48.085816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerStarted","Data":"dd7d03124fb75f94107be6384eb24863fa1c970bad4219cbc9709d64b1ec0dda"}
Feb 19 18:54:48 crc kubenswrapper[4749]: I0219 18:54:48.379443 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:54:48 crc kubenswrapper[4749]: I0219 18:54:48.577175 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:54:48 crc kubenswrapper[4749]: I0219 18:54:48.577204 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:54:48 crc kubenswrapper[4749]: I0219 18:54:48.690922 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39e6857-5164-4568-b6e9-bbb966f59eaf" path="/var/lib/kubelet/pods/b39e6857-5164-4568-b6e9-bbb966f59eaf/volumes"
Feb 19 18:54:49 crc kubenswrapper[4749]: I0219 18:54:49.095984 4749 generic.go:334] "Generic (PLEG): container finished" podID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerID="5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242" exitCode=0
Feb 19 18:54:49 crc kubenswrapper[4749]: I0219 18:54:49.096102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerDied","Data":"5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242"}
Feb 19 18:54:49 crc kubenswrapper[4749]: I0219 18:54:49.099881 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerStarted","Data":"13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213"}
Feb 19 18:54:49 crc kubenswrapper[4749]: I0219 18:54:49.099930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerStarted","Data":"d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732"}
Feb 19 18:54:49 crc kubenswrapper[4749]: I0219 18:54:49.099946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerStarted","Data":"9f94aa30d4128b3575752d64f28313893e857332b924199f3d2c3bc6cfcbda4e"}
Feb 19 18:54:50 crc kubenswrapper[4749]: I0219 18:54:50.109409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerStarted","Data":"f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d"}
Feb 19 18:54:50 crc kubenswrapper[4749]: I0219 18:54:50.111629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerStarted","Data":"821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c"}
Feb 19 18:54:50 crc kubenswrapper[4749]: I0219 18:54:50.387646 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 19 18:54:50 crc kubenswrapper[4749]: I0219 18:54:50.842459 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 19 18:54:51 crc kubenswrapper[4749]: I0219 18:54:51.382802 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 18:54:51 crc kubenswrapper[4749]: I0219 18:54:51.383122 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 18:54:52 crc kubenswrapper[4749]: I0219 18:54:52.132130 4749 generic.go:334] "Generic (PLEG): container finished" podID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerID="f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d" exitCode=0
Feb 19 18:54:52 crc kubenswrapper[4749]: I0219 18:54:52.132174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerDied","Data":"f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d"}
Feb 19 18:54:52 crc kubenswrapper[4749]: I0219 18:54:52.377592 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 18:54:52 crc kubenswrapper[4749]: I0219 18:54:52.418793 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 18:54:52 crc kubenswrapper[4749]: I0219 18:54:52.467562 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:54:52 crc kubenswrapper[4749]: I0219 18:54:52.467987 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 18:54:53 crc kubenswrapper[4749]: I0219 18:54:53.187259 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 18:54:55 crc kubenswrapper[4749]: I0219 18:54:55.169495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerStarted","Data":"f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5"}
Feb 19 18:54:55 crc kubenswrapper[4749]: I0219 18:54:55.172410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerStarted","Data":"3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218"}
Feb 19 18:54:55 crc kubenswrapper[4749]: I0219 18:54:55.172557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 18:54:55 crc kubenswrapper[4749]: I0219 18:54:55.233687 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w2ns9" podStartSLOduration=4.068674298 podStartE2EDuration="9.233666141s" podCreationTimestamp="2026-02-19 18:54:46 +0000 UTC" firstStartedPulling="2026-02-19 18:54:49.098159382 +0000 UTC m=+1263.059379346" lastFinishedPulling="2026-02-19 18:54:54.263151235 +0000 UTC m=+1268.224371189" observedRunningTime="2026-02-19 18:54:55.201321038 +0000 UTC m=+1269.162541002" watchObservedRunningTime="2026-02-19 18:54:55.233666141 +0000 UTC m=+1269.194886095"
Feb 19 18:54:57 crc kubenswrapper[4749]: I0219 18:54:57.300781 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:57 crc kubenswrapper[4749]: I0219 18:54:57.301144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:54:57 crc kubenswrapper[4749]: I0219 18:54:57.563097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 18:54:57 crc kubenswrapper[4749]: I0219 18:54:57.568287 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 18:54:57 crc kubenswrapper[4749]: I0219 18:54:57.572891 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 18:54:57 crc kubenswrapper[4749]: I0219 18:54:57.582814 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.833534144 podStartE2EDuration="10.582790533s" podCreationTimestamp="2026-02-19 18:54:47 +0000 UTC" firstStartedPulling="2026-02-19 18:54:48.382137806 +0000 UTC m=+1262.343357760" lastFinishedPulling="2026-02-19 18:54:54.131394195 +0000 UTC m=+1268.092614149" observedRunningTime="2026-02-19 18:54:55.236151701 +0000 UTC m=+1269.197371655" watchObservedRunningTime="2026-02-19 18:54:57.582790533 +0000 UTC m=+1271.544010497"
Feb 19 18:54:58 crc kubenswrapper[4749]: I0219 18:54:58.221326 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 18:54:58 crc kubenswrapper[4749]: I0219 18:54:58.343491 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ns9" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="registry-server" probeResult="failure" output=<
Feb 19 18:54:58 crc kubenswrapper[4749]: 	timeout: failed to connect service ":50051" within 1s
Feb 19 18:54:58 crc kubenswrapper[4749]: >
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.126409 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.217641 4749 generic.go:334] "Generic (PLEG): container finished" podID="0fab25af-587f-47c5-a11c-c733e64c3be9" containerID="561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382" exitCode=137
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.218185 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.218195 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0fab25af-587f-47c5-a11c-c733e64c3be9","Type":"ContainerDied","Data":"561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382"}
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.218311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0fab25af-587f-47c5-a11c-c733e64c3be9","Type":"ContainerDied","Data":"79752a59192bc5af7e0392926a900018b90511789646065101c112bd6c1fc347"}
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.218330 4749 scope.go:117] "RemoveContainer" containerID="561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382"
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.235637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc8p9\" (UniqueName: \"kubernetes.io/projected/0fab25af-587f-47c5-a11c-c733e64c3be9-kube-api-access-cc8p9\") pod \"0fab25af-587f-47c5-a11c-c733e64c3be9\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") "
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.236542 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-config-data\") pod \"0fab25af-587f-47c5-a11c-c733e64c3be9\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") "
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.236589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-combined-ca-bundle\") pod \"0fab25af-587f-47c5-a11c-c733e64c3be9\" (UID: \"0fab25af-587f-47c5-a11c-c733e64c3be9\") "
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.243249 4749 scope.go:117] "RemoveContainer" containerID="561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382"
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.243539 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fab25af-587f-47c5-a11c-c733e64c3be9-kube-api-access-cc8p9" (OuterVolumeSpecName: "kube-api-access-cc8p9") pod "0fab25af-587f-47c5-a11c-c733e64c3be9" (UID: "0fab25af-587f-47c5-a11c-c733e64c3be9"). InnerVolumeSpecName "kube-api-access-cc8p9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:55:00 crc kubenswrapper[4749]: E0219 18:55:00.243731 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382\": container with ID starting with 561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382 not found: ID does not exist" containerID="561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382"
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.243788 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382"} err="failed to get container status \"561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382\": rpc error: code = NotFound desc = could not find container \"561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382\": container with ID starting with 561f69d879ff790f3fb762c5bc4b844f1bfbd3aa45865f17527ccb3a53362382 not found: ID does not exist"
Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.262176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fab25af-587f-47c5-a11c-c733e64c3be9" (UID: "0fab25af-587f-47c5-a11c-c733e64c3be9").
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.270762 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-config-data" (OuterVolumeSpecName: "config-data") pod "0fab25af-587f-47c5-a11c-c733e64c3be9" (UID: "0fab25af-587f-47c5-a11c-c733e64c3be9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.338944 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.338986 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fab25af-587f-47c5-a11c-c733e64c3be9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.339001 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc8p9\" (UniqueName: \"kubernetes.io/projected/0fab25af-587f-47c5-a11c-c733e64c3be9-kube-api-access-cc8p9\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.548810 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.558942 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.569314 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:55:00 crc kubenswrapper[4749]: E0219 18:55:00.569921 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fab25af-587f-47c5-a11c-c733e64c3be9" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.570280 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fab25af-587f-47c5-a11c-c733e64c3be9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.570589 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fab25af-587f-47c5-a11c-c733e64c3be9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.571461 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.574377 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.574777 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.574944 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.577314 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.691965 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fab25af-587f-47c5-a11c-c733e64c3be9" path="/var/lib/kubelet/pods/0fab25af-587f-47c5-a11c-c733e64c3be9/volumes" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.747247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.748365 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.748437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.748465 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.748612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffgp6\" (UniqueName: \"kubernetes.io/projected/a05f7aea-8655-4484-ad07-c9c6f0e98880-kube-api-access-ffgp6\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.850890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffgp6\" (UniqueName: \"kubernetes.io/projected/a05f7aea-8655-4484-ad07-c9c6f0e98880-kube-api-access-ffgp6\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.851001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.851139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.851176 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.851200 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.854766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 
18:55:00.854777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.856692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.860091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a05f7aea-8655-4484-ad07-c9c6f0e98880-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.868341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffgp6\" (UniqueName: \"kubernetes.io/projected/a05f7aea-8655-4484-ad07-c9c6f0e98880-kube-api-access-ffgp6\") pod \"nova-cell1-novncproxy-0\" (UID: \"a05f7aea-8655-4484-ad07-c9c6f0e98880\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:00 crc kubenswrapper[4749]: I0219 18:55:00.930813 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:01 crc kubenswrapper[4749]: I0219 18:55:01.360827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 18:55:01 crc kubenswrapper[4749]: I0219 18:55:01.390330 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 18:55:01 crc kubenswrapper[4749]: I0219 18:55:01.391694 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 18:55:01 crc kubenswrapper[4749]: I0219 18:55:01.391837 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 18:55:01 crc kubenswrapper[4749]: I0219 18:55:01.399825 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.243311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a05f7aea-8655-4484-ad07-c9c6f0e98880","Type":"ContainerStarted","Data":"eb9a037a5926ff29c4c3fe9cccf815444ee3341fa09d6365971ce591daee6a53"} Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.243686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a05f7aea-8655-4484-ad07-c9c6f0e98880","Type":"ContainerStarted","Data":"3569b09cdd4ddfac23a277416d68ac1789c351eba9cd3651d46fb951fadc828f"} Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.244154 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.249465 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.259940 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=2.259919675 podStartE2EDuration="2.259919675s" podCreationTimestamp="2026-02-19 18:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:55:02.257864346 +0000 UTC m=+1276.219084310" watchObservedRunningTime="2026-02-19 18:55:02.259919675 +0000 UTC m=+1276.221139649" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.418589 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-597b667f69-7hpgv"] Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.423970 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.435985 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-597b667f69-7hpgv"] Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.582879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-svc\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.583509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdlj\" (UniqueName: \"kubernetes.io/projected/1c41744d-1aa4-4ea7-bef0-fa0838093916-kube-api-access-pfdlj\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.583558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-sb\") pod 
\"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.583784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-nb\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.583834 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-swift-storage-0\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.584194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-config\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.685547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-nb\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.685584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-swift-storage-0\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.685644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-config\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.685705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-svc\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.685734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdlj\" (UniqueName: \"kubernetes.io/projected/1c41744d-1aa4-4ea7-bef0-fa0838093916-kube-api-access-pfdlj\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.685754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-sb\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.686790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-nb\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.686794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-config\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.686875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-swift-storage-0\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.686909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-svc\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.686962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-sb\") pod \"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.719006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdlj\" (UniqueName: \"kubernetes.io/projected/1c41744d-1aa4-4ea7-bef0-fa0838093916-kube-api-access-pfdlj\") pod 
\"dnsmasq-dns-597b667f69-7hpgv\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") " pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:02 crc kubenswrapper[4749]: I0219 18:55:02.766925 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:03 crc kubenswrapper[4749]: I0219 18:55:03.327937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-597b667f69-7hpgv"] Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.264393 4749 generic.go:334] "Generic (PLEG): container finished" podID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerID="0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c" exitCode=0 Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.264612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" event={"ID":"1c41744d-1aa4-4ea7-bef0-fa0838093916","Type":"ContainerDied","Data":"0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c"} Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.264822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" event={"ID":"1c41744d-1aa4-4ea7-bef0-fa0838093916","Type":"ContainerStarted","Data":"e0d3a89faf507083f75f5c969d131a8c6bad0eb728637637c97600635fd47155"} Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.447892 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.448719 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-central-agent" containerID="cri-o://d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732" gracePeriod=30 Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.449089 4749 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="sg-core" containerID="cri-o://821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c" gracePeriod=30 Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.449286 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="proxy-httpd" containerID="cri-o://3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218" gracePeriod=30 Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.449356 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-notification-agent" containerID="cri-o://13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213" gracePeriod=30 Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.555606 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.224:3000/\": read tcp 10.217.0.2:38204->10.217.0.224:3000: read: connection reset by peer" Feb 19 18:55:04 crc kubenswrapper[4749]: I0219 18:55:04.906675 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.278441 4749 generic.go:334] "Generic (PLEG): container finished" podID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerID="3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218" exitCode=0 Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.278736 4749 generic.go:334] "Generic (PLEG): container finished" podID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerID="821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c" exitCode=2 Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 
18:55:05.278745 4749 generic.go:334] "Generic (PLEG): container finished" podID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerID="d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732" exitCode=0 Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.278551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerDied","Data":"3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218"} Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.278813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerDied","Data":"821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c"} Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.278828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerDied","Data":"d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732"} Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.282713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" event={"ID":"1c41744d-1aa4-4ea7-bef0-fa0838093916","Type":"ContainerStarted","Data":"ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979"} Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.282994 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-log" containerID="cri-o://007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33" gracePeriod=30 Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.284527 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-api" 
containerID="cri-o://5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c" gracePeriod=30
Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.310889 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" podStartSLOduration=3.310872109 podStartE2EDuration="3.310872109s" podCreationTimestamp="2026-02-19 18:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:55:05.305668983 +0000 UTC m=+1279.266888957" watchObservedRunningTime="2026-02-19 18:55:05.310872109 +0000 UTC m=+1279.272092063"
Feb 19 18:55:05 crc kubenswrapper[4749]: I0219 18:55:05.931755 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.298248 4749 generic.go:334] "Generic (PLEG): container finished" podID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerID="007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33" exitCode=143
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.298456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71f6b71f-4791-4a38-b5ae-282a0f643b6c","Type":"ContainerDied","Data":"007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33"}
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.298930 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-597b667f69-7hpgv"
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.814826 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.976849 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71f6b71f-4791-4a38-b5ae-282a0f643b6c-logs\") pod \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") "
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.976908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-config-data\") pod \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") "
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.976931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vdt\" (UniqueName: \"kubernetes.io/projected/71f6b71f-4791-4a38-b5ae-282a0f643b6c-kube-api-access-68vdt\") pod \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") "
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.977049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-combined-ca-bundle\") pod \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\" (UID: \"71f6b71f-4791-4a38-b5ae-282a0f643b6c\") "
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.977294 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71f6b71f-4791-4a38-b5ae-282a0f643b6c-logs" (OuterVolumeSpecName: "logs") pod "71f6b71f-4791-4a38-b5ae-282a0f643b6c" (UID: "71f6b71f-4791-4a38-b5ae-282a0f643b6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:55:06 crc kubenswrapper[4749]: I0219 18:55:06.979213 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71f6b71f-4791-4a38-b5ae-282a0f643b6c-logs\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.009941 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f6b71f-4791-4a38-b5ae-282a0f643b6c-kube-api-access-68vdt" (OuterVolumeSpecName: "kube-api-access-68vdt") pod "71f6b71f-4791-4a38-b5ae-282a0f643b6c" (UID: "71f6b71f-4791-4a38-b5ae-282a0f643b6c"). InnerVolumeSpecName "kube-api-access-68vdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.019504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71f6b71f-4791-4a38-b5ae-282a0f643b6c" (UID: "71f6b71f-4791-4a38-b5ae-282a0f643b6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.026352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-config-data" (OuterVolumeSpecName: "config-data") pod "71f6b71f-4791-4a38-b5ae-282a0f643b6c" (UID: "71f6b71f-4791-4a38-b5ae-282a0f643b6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.080770 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.080800 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vdt\" (UniqueName: \"kubernetes.io/projected/71f6b71f-4791-4a38-b5ae-282a0f643b6c-kube-api-access-68vdt\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.080811 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f6b71f-4791-4a38-b5ae-282a0f643b6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.112263 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.283594 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-log-httpd\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.283982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-sg-core-conf-yaml\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.284019 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-run-httpd\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.284061 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-combined-ca-bundle\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.284212 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-scripts\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.284264 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-ceilometer-tls-certs\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.284326 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnr4m\" (UniqueName: \"kubernetes.io/projected/db79f1f6-dfad-45fd-b30c-c88c883f6318-kube-api-access-nnr4m\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.284347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-config-data\") pod \"db79f1f6-dfad-45fd-b30c-c88c883f6318\" (UID: \"db79f1f6-dfad-45fd-b30c-c88c883f6318\") "
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.288672 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.290468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.295623 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-scripts" (OuterVolumeSpecName: "scripts") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.305299 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db79f1f6-dfad-45fd-b30c-c88c883f6318-kube-api-access-nnr4m" (OuterVolumeSpecName: "kube-api-access-nnr4m") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "kube-api-access-nnr4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.331641 4749 generic.go:334] "Generic (PLEG): container finished" podID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerID="5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c" exitCode=0
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.331699 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71f6b71f-4791-4a38-b5ae-282a0f643b6c","Type":"ContainerDied","Data":"5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c"}
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.331724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71f6b71f-4791-4a38-b5ae-282a0f643b6c","Type":"ContainerDied","Data":"47d5a6add3446140db91922bebc6c24432b676470566ea0f1d6b087db762c784"}
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.331741 4749 scope.go:117] "RemoveContainer" containerID="5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.331865 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.354322 4749 generic.go:334] "Generic (PLEG): container finished" podID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerID="13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213" exitCode=0
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.355348 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.355844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerDied","Data":"13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213"}
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.355871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db79f1f6-dfad-45fd-b30c-c88c883f6318","Type":"ContainerDied","Data":"9f94aa30d4128b3575752d64f28313893e857332b924199f3d2c3bc6cfcbda4e"}
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.362279 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.388977 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.389009 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.389020 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.389047 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnr4m\" (UniqueName: \"kubernetes.io/projected/db79f1f6-dfad-45fd-b30c-c88c883f6318-kube-api-access-nnr4m\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.389057 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db79f1f6-dfad-45fd-b30c-c88c883f6318-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.396584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.460517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.463299 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-config-data" (OuterVolumeSpecName: "config-data") pod "db79f1f6-dfad-45fd-b30c-c88c883f6318" (UID: "db79f1f6-dfad-45fd-b30c-c88c883f6318"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.490676 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.490909 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.490993 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db79f1f6-dfad-45fd-b30c-c88c883f6318-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.561558 4749 scope.go:117] "RemoveContainer" containerID="007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.566138 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.586721 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.594479 4749 scope.go:117] "RemoveContainer" containerID="5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.595087 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c\": container with ID starting with 5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c not found: ID does not exist" containerID="5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.595126 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c"} err="failed to get container status \"5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c\": rpc error: code = NotFound desc = could not find container \"5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c\": container with ID starting with 5f89aff398323f23140302978b1bc2e20e0516f69246cf35bcc4b634a078617c not found: ID does not exist"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.595153 4749 scope.go:117] "RemoveContainer" containerID="007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.595547 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33\": container with ID starting with 007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33 not found: ID does not exist" containerID="007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.595579 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33"} err="failed to get container status \"007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33\": rpc error: code = NotFound desc = could not find container \"007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33\": container with ID starting with 007ad0b9a3629df8e24aab6c43fc1a95728a66d547f53b1015e7f820e0165f33 not found: ID does not exist"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.595598 4749 scope.go:117] "RemoveContainer" containerID="3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.603885 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.604437 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="sg-core"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604466 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="sg-core"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.604480 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-central-agent"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604489 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-central-agent"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.604527 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-notification-agent"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604536 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-notification-agent"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.604548 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-log"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604555 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-log"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.604570 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="proxy-httpd"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604579 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="proxy-httpd"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.604598 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-api"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604606 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-api"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604825 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-central-agent"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604860 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="ceilometer-notification-agent"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604878 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-log"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604891 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="sg-core"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604907 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" containerName="nova-api-api"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.604919 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" containerName="proxy-httpd"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.606250 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.608869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.609336 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.609490 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.611955 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.633288 4749 scope.go:117] "RemoveContainer" containerID="821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.680193 4749 scope.go:117] "RemoveContainer" containerID="13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.727366 4749 scope.go:117] "RemoveContainer" containerID="d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.768553 4749 scope.go:117] "RemoveContainer" containerID="3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.769988 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218\": container with ID starting with 3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218 not found: ID does not exist" containerID="3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.770086 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218"} err="failed to get container status \"3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218\": rpc error: code = NotFound desc = could not find container \"3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218\": container with ID starting with 3f08a9edaffa8ffbd551c257cbb05e791b5dda16a4cb91d5c0bec5aeebdd7218 not found: ID does not exist"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.770146 4749 scope.go:117] "RemoveContainer" containerID="821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.770593 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c\": container with ID starting with 821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c not found: ID does not exist" containerID="821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.770634 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c"} err="failed to get container status \"821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c\": rpc error: code = NotFound desc = could not find container \"821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c\": container with ID starting with 821fb946e5eaecd9878194e09e11f4e1ee230f1ecfd0cdb47cda586ed4fc4d3c not found: ID does not exist"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.770655 4749 scope.go:117] "RemoveContainer" containerID="13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.770914 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213\": container with ID starting with 13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213 not found: ID does not exist" containerID="13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.770944 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213"} err="failed to get container status \"13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213\": rpc error: code = NotFound desc = could not find container \"13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213\": container with ID starting with 13d8986b86d04e8742e9dfa924c64dabc518bb4c0682fe13091c69165fcf7213 not found: ID does not exist"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.770964 4749 scope.go:117] "RemoveContainer" containerID="d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732"
Feb 19 18:55:07 crc kubenswrapper[4749]: E0219 18:55:07.771199 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732\": container with ID starting with d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732 not found: ID does not exist" containerID="d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.771237 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732"} err="failed to get container status \"d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732\": rpc error: code = NotFound desc = could not find container \"d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732\": container with ID starting with d9bca9cba8909506dc6f1e91e386c4d7134a54c719d90cb4eb55667f722f0732 not found: ID does not exist"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.778182 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.786758 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.804646 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.807143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.809345 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.809541 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.810743 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.813063 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.829304 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef85132-80d3-4289-a603-d9bbcb2b8637-logs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.829360 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.829484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.829539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d95c\" (UniqueName: \"kubernetes.io/projected/7ef85132-80d3-4289-a603-d9bbcb2b8637-kube-api-access-9d95c\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.829556 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.829589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-config-data\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d95c\" (UniqueName: \"kubernetes.io/projected/7ef85132-80d3-4289-a603-d9bbcb2b8637-kube-api-access-9d95c\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpncx\" (UniqueName: \"kubernetes.io/projected/e481767e-68e7-4396-b8aa-51956e378132-kube-api-access-wpncx\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-config-data\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef85132-80d3-4289-a603-d9bbcb2b8637-logs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-scripts\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.933957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e481767e-68e7-4396-b8aa-51956e378132-log-httpd\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.934022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-config-data\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.934177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e481767e-68e7-4396-b8aa-51956e378132-run-httpd\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.934255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.934302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef85132-80d3-4289-a603-d9bbcb2b8637-logs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.937254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.937890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.949346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.950346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-config-data\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0"
Feb 19 18:55:07 crc kubenswrapper[4749]: I0219 18:55:07.951951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d95c\" (UniqueName:
\"kubernetes.io/projected/7ef85132-80d3-4289-a603-d9bbcb2b8637-kube-api-access-9d95c\") pod \"nova-api-0\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") " pod="openstack/nova-api-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.035950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.036640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.037079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-scripts\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.037132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e481767e-68e7-4396-b8aa-51956e378132-log-httpd\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.037172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-config-data\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 
18:55:08.037227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e481767e-68e7-4396-b8aa-51956e378132-run-httpd\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.037359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.037382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpncx\" (UniqueName: \"kubernetes.io/projected/e481767e-68e7-4396-b8aa-51956e378132-kube-api-access-wpncx\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.037652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e481767e-68e7-4396-b8aa-51956e378132-log-httpd\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.037970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e481767e-68e7-4396-b8aa-51956e378132-run-httpd\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.043539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-scripts\") pod \"ceilometer-0\" (UID: 
\"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.043615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-config-data\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.043917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.044556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.044916 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e481767e-68e7-4396-b8aa-51956e378132-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.055399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpncx\" (UniqueName: \"kubernetes.io/projected/e481767e-68e7-4396-b8aa-51956e378132-kube-api-access-wpncx\") pod \"ceilometer-0\" (UID: \"e481767e-68e7-4396-b8aa-51956e378132\") " pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.128665 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.242960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.394313 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ns9" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="registry-server" probeResult="failure" output=< Feb 19 18:55:08 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 18:55:08 crc kubenswrapper[4749]: > Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.618008 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.688950 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f6b71f-4791-4a38-b5ae-282a0f643b6c" path="/var/lib/kubelet/pods/71f6b71f-4791-4a38-b5ae-282a0f643b6c/volumes" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.689615 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db79f1f6-dfad-45fd-b30c-c88c883f6318" path="/var/lib/kubelet/pods/db79f1f6-dfad-45fd-b30c-c88c883f6318/volumes" Feb 19 18:55:08 crc kubenswrapper[4749]: I0219 18:55:08.757605 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:55:08 crc kubenswrapper[4749]: W0219 18:55:08.759506 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef85132_80d3_4289_a603_d9bbcb2b8637.slice/crio-1a7b6b44e87066139ec386eb4bfe51eeaffb82cb061b3122d2e0b0ccf9760722 WatchSource:0}: Error finding container 1a7b6b44e87066139ec386eb4bfe51eeaffb82cb061b3122d2e0b0ccf9760722: Status 404 returned error can't find the container with id 1a7b6b44e87066139ec386eb4bfe51eeaffb82cb061b3122d2e0b0ccf9760722 Feb 19 18:55:09 crc 
kubenswrapper[4749]: I0219 18:55:09.388340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e481767e-68e7-4396-b8aa-51956e378132","Type":"ContainerStarted","Data":"ad01b49a3b694436b9519acda967f917ee240f05bec10e08d4993768d848297c"} Feb 19 18:55:09 crc kubenswrapper[4749]: I0219 18:55:09.390635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ef85132-80d3-4289-a603-d9bbcb2b8637","Type":"ContainerStarted","Data":"0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242"} Feb 19 18:55:09 crc kubenswrapper[4749]: I0219 18:55:09.390716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ef85132-80d3-4289-a603-d9bbcb2b8637","Type":"ContainerStarted","Data":"1a7b6b44e87066139ec386eb4bfe51eeaffb82cb061b3122d2e0b0ccf9760722"} Feb 19 18:55:10 crc kubenswrapper[4749]: I0219 18:55:10.400978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ef85132-80d3-4289-a603-d9bbcb2b8637","Type":"ContainerStarted","Data":"d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc"} Feb 19 18:55:10 crc kubenswrapper[4749]: I0219 18:55:10.404684 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e481767e-68e7-4396-b8aa-51956e378132","Type":"ContainerStarted","Data":"a9472be05f7963adea775cb1f68fefe409979ed3017298f035968e327da15969"} Feb 19 18:55:10 crc kubenswrapper[4749]: I0219 18:55:10.404758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e481767e-68e7-4396-b8aa-51956e378132","Type":"ContainerStarted","Data":"fdcb8ebd474c6d3a8a3b25a862388573fc16e225042117f39f94ed4de889e608"} Feb 19 18:55:10 crc kubenswrapper[4749]: I0219 18:55:10.448470 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.448448278 podStartE2EDuration="3.448448278s" 
podCreationTimestamp="2026-02-19 18:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:55:10.417687443 +0000 UTC m=+1284.378907407" watchObservedRunningTime="2026-02-19 18:55:10.448448278 +0000 UTC m=+1284.409668232" Feb 19 18:55:10 crc kubenswrapper[4749]: I0219 18:55:10.931211 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:10 crc kubenswrapper[4749]: I0219 18:55:10.949819 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.418397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e481767e-68e7-4396-b8aa-51956e378132","Type":"ContainerStarted","Data":"a199a179101c4413ed0df3c8f7bb775067bb4652f1842205b1d3f527deb93a0e"} Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.436365 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.586399 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4j58v"] Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.587763 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.589394 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.590107 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.598285 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4j58v"] Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.714491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-config-data\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.714557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.714632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b826d\" (UniqueName: \"kubernetes.io/projected/886c599b-2da2-4553-bc21-4b0b4e50c3bc-kube-api-access-b826d\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.714680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-scripts\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.816537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-scripts\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.818419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-config-data\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.818563 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.818616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b826d\" (UniqueName: \"kubernetes.io/projected/886c599b-2da2-4553-bc21-4b0b4e50c3bc-kube-api-access-b826d\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.821956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-scripts\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.822142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-config-data\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.822813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.843126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b826d\" (UniqueName: \"kubernetes.io/projected/886c599b-2da2-4553-bc21-4b0b4e50c3bc-kube-api-access-b826d\") pod \"nova-cell1-cell-mapping-4j58v\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:11 crc kubenswrapper[4749]: I0219 18:55:11.907293 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:12 crc kubenswrapper[4749]: W0219 18:55:12.368605 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886c599b_2da2_4553_bc21_4b0b4e50c3bc.slice/crio-95511f3c830d2125df0060ba6d6a028e34a6e76568df57792743ad947ce722f4 WatchSource:0}: Error finding container 95511f3c830d2125df0060ba6d6a028e34a6e76568df57792743ad947ce722f4: Status 404 returned error can't find the container with id 95511f3c830d2125df0060ba6d6a028e34a6e76568df57792743ad947ce722f4 Feb 19 18:55:12 crc kubenswrapper[4749]: I0219 18:55:12.371695 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4j58v"] Feb 19 18:55:12 crc kubenswrapper[4749]: I0219 18:55:12.436487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4j58v" event={"ID":"886c599b-2da2-4553-bc21-4b0b4e50c3bc","Type":"ContainerStarted","Data":"95511f3c830d2125df0060ba6d6a028e34a6e76568df57792743ad947ce722f4"} Feb 19 18:55:12 crc kubenswrapper[4749]: I0219 18:55:12.439664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e481767e-68e7-4396-b8aa-51956e378132","Type":"ContainerStarted","Data":"62a371f88caa3c81ebef765523b1c1460b12eafc1c2f9e560ba8bbd8c6a054aa"} Feb 19 18:55:12 crc kubenswrapper[4749]: I0219 18:55:12.476422 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.927403075 podStartE2EDuration="5.476397585s" podCreationTimestamp="2026-02-19 18:55:07 +0000 UTC" firstStartedPulling="2026-02-19 18:55:08.623400045 +0000 UTC m=+1282.584619999" lastFinishedPulling="2026-02-19 18:55:12.172394565 +0000 UTC m=+1286.133614509" observedRunningTime="2026-02-19 18:55:12.462107759 +0000 UTC m=+1286.423327803" watchObservedRunningTime="2026-02-19 18:55:12.476397585 +0000 UTC 
m=+1286.437617549" Feb 19 18:55:12 crc kubenswrapper[4749]: I0219 18:55:12.769193 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" Feb 19 18:55:12 crc kubenswrapper[4749]: I0219 18:55:12.846836 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c47c5f7-8xl4v"] Feb 19 18:55:12 crc kubenswrapper[4749]: I0219 18:55:12.847380 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" podUID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerName="dnsmasq-dns" containerID="cri-o://aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab" gracePeriod=10 Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.372794 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.448701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4j58v" event={"ID":"886c599b-2da2-4553-bc21-4b0b4e50c3bc","Type":"ContainerStarted","Data":"f59c625d7a05c202e5b8535a609c6df2377e8d2d375519dd3af845ed1b81b0a2"} Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.450929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-svc\") pod \"6ba77726-b8ed-4851-8796-ddda0f4d4406\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.451070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-config\") pod \"6ba77726-b8ed-4851-8796-ddda0f4d4406\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.451118 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-nb\") pod \"6ba77726-b8ed-4851-8796-ddda0f4d4406\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.451168 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-swift-storage-0\") pod \"6ba77726-b8ed-4851-8796-ddda0f4d4406\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.451200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwsck\" (UniqueName: \"kubernetes.io/projected/6ba77726-b8ed-4851-8796-ddda0f4d4406-kube-api-access-nwsck\") pod \"6ba77726-b8ed-4851-8796-ddda0f4d4406\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.451224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-sb\") pod \"6ba77726-b8ed-4851-8796-ddda0f4d4406\" (UID: \"6ba77726-b8ed-4851-8796-ddda0f4d4406\") " Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.455531 4749 generic.go:334] "Generic (PLEG): container finished" podID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerID="aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab" exitCode=0 Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.456536 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.456697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" event={"ID":"6ba77726-b8ed-4851-8796-ddda0f4d4406","Type":"ContainerDied","Data":"aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab"} Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.456731 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.456743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c47c5f7-8xl4v" event={"ID":"6ba77726-b8ed-4851-8796-ddda0f4d4406","Type":"ContainerDied","Data":"b42eb91c31ae37329644415de1ac81fb41a50509a6e3277817978af05bc574b5"} Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.456762 4749 scope.go:117] "RemoveContainer" containerID="aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.478678 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4j58v" podStartSLOduration=2.47865984 podStartE2EDuration="2.47865984s" podCreationTimestamp="2026-02-19 18:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:55:13.472177152 +0000 UTC m=+1287.433397116" watchObservedRunningTime="2026-02-19 18:55:13.47865984 +0000 UTC m=+1287.439879794" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.494631 4749 scope.go:117] "RemoveContainer" containerID="a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.505306 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6ba77726-b8ed-4851-8796-ddda0f4d4406-kube-api-access-nwsck" (OuterVolumeSpecName: "kube-api-access-nwsck") pod "6ba77726-b8ed-4851-8796-ddda0f4d4406" (UID: "6ba77726-b8ed-4851-8796-ddda0f4d4406"). InnerVolumeSpecName "kube-api-access-nwsck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.515951 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-config" (OuterVolumeSpecName: "config") pod "6ba77726-b8ed-4851-8796-ddda0f4d4406" (UID: "6ba77726-b8ed-4851-8796-ddda0f4d4406"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.541073 4749 scope.go:117] "RemoveContainer" containerID="aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab" Feb 19 18:55:13 crc kubenswrapper[4749]: E0219 18:55:13.542820 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab\": container with ID starting with aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab not found: ID does not exist" containerID="aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.542901 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab"} err="failed to get container status \"aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab\": rpc error: code = NotFound desc = could not find container \"aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab\": container with ID starting with aecfb5f214ab6faa87dc13e057b0bcb9d6481569718066b3b2af4a54e23798ab not found: ID does not exist" Feb 19 18:55:13 crc 
kubenswrapper[4749]: I0219 18:55:13.542922 4749 scope.go:117] "RemoveContainer" containerID="a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81" Feb 19 18:55:13 crc kubenswrapper[4749]: E0219 18:55:13.543388 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81\": container with ID starting with a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81 not found: ID does not exist" containerID="a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.543411 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81"} err="failed to get container status \"a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81\": rpc error: code = NotFound desc = could not find container \"a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81\": container with ID starting with a05887b283e626e6dc36df5094cee78370b324544e091f5b966b839486ceda81 not found: ID does not exist" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.544051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ba77726-b8ed-4851-8796-ddda0f4d4406" (UID: "6ba77726-b8ed-4851-8796-ddda0f4d4406"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.553437 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ba77726-b8ed-4851-8796-ddda0f4d4406" (UID: "6ba77726-b8ed-4851-8796-ddda0f4d4406"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.555343 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.555366 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwsck\" (UniqueName: \"kubernetes.io/projected/6ba77726-b8ed-4851-8796-ddda0f4d4406-kube-api-access-nwsck\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.555377 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.555387 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-config\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.573558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ba77726-b8ed-4851-8796-ddda0f4d4406" (UID: "6ba77726-b8ed-4851-8796-ddda0f4d4406"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.575964 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ba77726-b8ed-4851-8796-ddda0f4d4406" (UID: "6ba77726-b8ed-4851-8796-ddda0f4d4406"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.657631 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.657665 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ba77726-b8ed-4851-8796-ddda0f4d4406-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.806489 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c47c5f7-8xl4v"] Feb 19 18:55:13 crc kubenswrapper[4749]: I0219 18:55:13.825905 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c47c5f7-8xl4v"] Feb 19 18:55:14 crc kubenswrapper[4749]: I0219 18:55:14.697565 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba77726-b8ed-4851-8796-ddda0f4d4406" path="/var/lib/kubelet/pods/6ba77726-b8ed-4851-8796-ddda0f4d4406/volumes" Feb 19 18:55:18 crc kubenswrapper[4749]: I0219 18:55:18.243801 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 18:55:18 crc kubenswrapper[4749]: I0219 18:55:18.244743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 18:55:18 crc kubenswrapper[4749]: I0219 18:55:18.353607 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ns9" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="registry-server" probeResult="failure" output=< Feb 19 18:55:18 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 18:55:18 crc kubenswrapper[4749]: > Feb 19 18:55:18 crc kubenswrapper[4749]: I0219 18:55:18.512777 4749 
generic.go:334] "Generic (PLEG): container finished" podID="886c599b-2da2-4553-bc21-4b0b4e50c3bc" containerID="f59c625d7a05c202e5b8535a609c6df2377e8d2d375519dd3af845ed1b81b0a2" exitCode=0 Feb 19 18:55:18 crc kubenswrapper[4749]: I0219 18:55:18.512826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4j58v" event={"ID":"886c599b-2da2-4553-bc21-4b0b4e50c3bc","Type":"ContainerDied","Data":"f59c625d7a05c202e5b8535a609c6df2377e8d2d375519dd3af845ed1b81b0a2"} Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.258201 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.258226 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.857460 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.979834 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-config-data\") pod \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.979871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-scripts\") pod \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.979941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-combined-ca-bundle\") pod \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.980005 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b826d\" (UniqueName: \"kubernetes.io/projected/886c599b-2da2-4553-bc21-4b0b4e50c3bc-kube-api-access-b826d\") pod \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\" (UID: \"886c599b-2da2-4553-bc21-4b0b4e50c3bc\") " Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.991792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-scripts" (OuterVolumeSpecName: "scripts") pod "886c599b-2da2-4553-bc21-4b0b4e50c3bc" (UID: "886c599b-2da2-4553-bc21-4b0b4e50c3bc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:19 crc kubenswrapper[4749]: I0219 18:55:19.992207 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886c599b-2da2-4553-bc21-4b0b4e50c3bc-kube-api-access-b826d" (OuterVolumeSpecName: "kube-api-access-b826d") pod "886c599b-2da2-4553-bc21-4b0b4e50c3bc" (UID: "886c599b-2da2-4553-bc21-4b0b4e50c3bc"). InnerVolumeSpecName "kube-api-access-b826d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.013560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "886c599b-2da2-4553-bc21-4b0b4e50c3bc" (UID: "886c599b-2da2-4553-bc21-4b0b4e50c3bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.023530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-config-data" (OuterVolumeSpecName: "config-data") pod "886c599b-2da2-4553-bc21-4b0b4e50c3bc" (UID: "886c599b-2da2-4553-bc21-4b0b4e50c3bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.082120 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.082149 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.082158 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c599b-2da2-4553-bc21-4b0b4e50c3bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.082171 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b826d\" (UniqueName: \"kubernetes.io/projected/886c599b-2da2-4553-bc21-4b0b4e50c3bc-kube-api-access-b826d\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.539151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4j58v" event={"ID":"886c599b-2da2-4553-bc21-4b0b4e50c3bc","Type":"ContainerDied","Data":"95511f3c830d2125df0060ba6d6a028e34a6e76568df57792743ad947ce722f4"} Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.539554 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95511f3c830d2125df0060ba6d6a028e34a6e76568df57792743ad947ce722f4" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.539216 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4j58v" Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.700849 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.701337 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-log" containerID="cri-o://0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242" gracePeriod=30 Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.701468 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-api" containerID="cri-o://d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc" gracePeriod=30 Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.714714 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.715169 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="471c4100-a672-4b83-842f-03a42aae3a92" containerName="nova-scheduler-scheduler" containerID="cri-o://c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85" gracePeriod=30 Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.768171 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.768457 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-log" containerID="cri-o://07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80" gracePeriod=30 Feb 19 18:55:20 crc kubenswrapper[4749]: I0219 18:55:20.769075 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-metadata" containerID="cri-o://6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad" gracePeriod=30 Feb 19 18:55:21 crc kubenswrapper[4749]: I0219 18:55:21.548645 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerID="0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242" exitCode=143 Feb 19 18:55:21 crc kubenswrapper[4749]: I0219 18:55:21.548834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ef85132-80d3-4289-a603-d9bbcb2b8637","Type":"ContainerDied","Data":"0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242"} Feb 19 18:55:21 crc kubenswrapper[4749]: I0219 18:55:21.550953 4749 generic.go:334] "Generic (PLEG): container finished" podID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerID="07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80" exitCode=143 Feb 19 18:55:21 crc kubenswrapper[4749]: I0219 18:55:21.550981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f489af23-b6c5-42a0-84ef-f5fb1910c79a","Type":"ContainerDied","Data":"07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80"} Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.064000 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.226505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-nova-metadata-tls-certs\") pod \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.226594 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-combined-ca-bundle\") pod \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.226629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f489af23-b6c5-42a0-84ef-f5fb1910c79a-logs\") pod \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.226674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-config-data\") pod \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.226797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6w6g\" (UniqueName: \"kubernetes.io/projected/f489af23-b6c5-42a0-84ef-f5fb1910c79a-kube-api-access-z6w6g\") pod \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\" (UID: \"f489af23-b6c5-42a0-84ef-f5fb1910c79a\") " Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.227655 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f489af23-b6c5-42a0-84ef-f5fb1910c79a-logs" (OuterVolumeSpecName: "logs") pod "f489af23-b6c5-42a0-84ef-f5fb1910c79a" (UID: "f489af23-b6c5-42a0-84ef-f5fb1910c79a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.234138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f489af23-b6c5-42a0-84ef-f5fb1910c79a-kube-api-access-z6w6g" (OuterVolumeSpecName: "kube-api-access-z6w6g") pod "f489af23-b6c5-42a0-84ef-f5fb1910c79a" (UID: "f489af23-b6c5-42a0-84ef-f5fb1910c79a"). InnerVolumeSpecName "kube-api-access-z6w6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.259612 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-config-data" (OuterVolumeSpecName: "config-data") pod "f489af23-b6c5-42a0-84ef-f5fb1910c79a" (UID: "f489af23-b6c5-42a0-84ef-f5fb1910c79a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.272535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f489af23-b6c5-42a0-84ef-f5fb1910c79a" (UID: "f489af23-b6c5-42a0-84ef-f5fb1910c79a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.317590 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f489af23-b6c5-42a0-84ef-f5fb1910c79a" (UID: "f489af23-b6c5-42a0-84ef-f5fb1910c79a"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.330138 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.330170 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.330180 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f489af23-b6c5-42a0-84ef-f5fb1910c79a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.330189 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489af23-b6c5-42a0-84ef-f5fb1910c79a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.330199 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6w6g\" (UniqueName: \"kubernetes.io/projected/f489af23-b6c5-42a0-84ef-f5fb1910c79a-kube-api-access-z6w6g\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.379397 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.380663 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.383341 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.383369 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="471c4100-a672-4b83-842f-03a42aae3a92" containerName="nova-scheduler-scheduler" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.561204 4749 generic.go:334] "Generic (PLEG): container finished" podID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerID="6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad" exitCode=0 Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.561242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f489af23-b6c5-42a0-84ef-f5fb1910c79a","Type":"ContainerDied","Data":"6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad"} Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.561267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f489af23-b6c5-42a0-84ef-f5fb1910c79a","Type":"ContainerDied","Data":"43e74321512e355c3daa85b6e126ec17b1a2227bd926560bf1df662688e4d10a"} Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.561283 4749 scope.go:117] "RemoveContainer" 
containerID="6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.561393 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.595228 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.595381 4749 scope.go:117] "RemoveContainer" containerID="07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.609404 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.630267 4749 scope.go:117] "RemoveContainer" containerID="6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad" Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.630738 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad\": container with ID starting with 6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad not found: ID does not exist" containerID="6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.630772 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad"} err="failed to get container status \"6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad\": rpc error: code = NotFound desc = could not find container \"6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad\": container with ID starting with 6ae26512a4b1f22b7d58307ebf2651e439bb635fc341d284ccf0f05624a449ad not found: ID does not exist" Feb 19 18:55:22 crc 
kubenswrapper[4749]: I0219 18:55:22.630796 4749 scope.go:117] "RemoveContainer" containerID="07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80" Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.631193 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80\": container with ID starting with 07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80 not found: ID does not exist" containerID="07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.631216 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80"} err="failed to get container status \"07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80\": rpc error: code = NotFound desc = could not find container \"07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80\": container with ID starting with 07e4157fa590b68f6ecc27b24f85c08e555259a8f57179f4ad33b2e8eac11a80 not found: ID does not exist" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633249 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.633656 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-metadata" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633673 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-metadata" Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.633692 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerName="dnsmasq-dns" Feb 19 18:55:22 crc 
kubenswrapper[4749]: I0219 18:55:22.633699 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerName="dnsmasq-dns" Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.633706 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-log" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633712 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-log" Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.633725 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886c599b-2da2-4553-bc21-4b0b4e50c3bc" containerName="nova-manage" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633731 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="886c599b-2da2-4553-bc21-4b0b4e50c3bc" containerName="nova-manage" Feb 19 18:55:22 crc kubenswrapper[4749]: E0219 18:55:22.633745 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerName="init" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633752 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerName="init" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633949 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-metadata" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633960 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba77726-b8ed-4851-8796-ddda0f4d4406" containerName="dnsmasq-dns" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.633983 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="886c599b-2da2-4553-bc21-4b0b4e50c3bc" containerName="nova-manage" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 
18:55:22.633995 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" containerName="nova-metadata-log" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.635171 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.636782 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.637323 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.641522 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.697907 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f489af23-b6c5-42a0-84ef-f5fb1910c79a" path="/var/lib/kubelet/pods/f489af23-b6c5-42a0-84ef-f5fb1910c79a/volumes" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.737263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9csrd\" (UniqueName: \"kubernetes.io/projected/3380219c-08a7-4ecd-8646-6e39cb13137b-kube-api-access-9csrd\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.737320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0" Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.737664 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.737707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3380219c-08a7-4ecd-8646-6e39cb13137b-logs\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.737762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-config-data\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.839142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.839223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3380219c-08a7-4ecd-8646-6e39cb13137b-logs\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.839258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-config-data\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.839287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9csrd\" (UniqueName: \"kubernetes.io/projected/3380219c-08a7-4ecd-8646-6e39cb13137b-kube-api-access-9csrd\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.839307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.840732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3380219c-08a7-4ecd-8646-6e39cb13137b-logs\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.843284 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-config-data\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.843996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.844409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3380219c-08a7-4ecd-8646-6e39cb13137b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.854717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9csrd\" (UniqueName: \"kubernetes.io/projected/3380219c-08a7-4ecd-8646-6e39cb13137b-kube-api-access-9csrd\") pod \"nova-metadata-0\" (UID: \"3380219c-08a7-4ecd-8646-6e39cb13137b\") " pod="openstack/nova-metadata-0"
Feb 19 18:55:22 crc kubenswrapper[4749]: I0219 18:55:22.973575 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.355716 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.456999 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.552860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-internal-tls-certs\") pod \"7ef85132-80d3-4289-a603-d9bbcb2b8637\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") "
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.552935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d95c\" (UniqueName: \"kubernetes.io/projected/7ef85132-80d3-4289-a603-d9bbcb2b8637-kube-api-access-9d95c\") pod \"7ef85132-80d3-4289-a603-d9bbcb2b8637\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") "
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.553060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-config-data\") pod \"7ef85132-80d3-4289-a603-d9bbcb2b8637\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") "
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.553096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef85132-80d3-4289-a603-d9bbcb2b8637-logs\") pod \"7ef85132-80d3-4289-a603-d9bbcb2b8637\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") "
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.553116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-combined-ca-bundle\") pod \"7ef85132-80d3-4289-a603-d9bbcb2b8637\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") "
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.553208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-public-tls-certs\") pod \"7ef85132-80d3-4289-a603-d9bbcb2b8637\" (UID: \"7ef85132-80d3-4289-a603-d9bbcb2b8637\") "
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.554479 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef85132-80d3-4289-a603-d9bbcb2b8637-logs" (OuterVolumeSpecName: "logs") pod "7ef85132-80d3-4289-a603-d9bbcb2b8637" (UID: "7ef85132-80d3-4289-a603-d9bbcb2b8637"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.559273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef85132-80d3-4289-a603-d9bbcb2b8637-kube-api-access-9d95c" (OuterVolumeSpecName: "kube-api-access-9d95c") pod "7ef85132-80d3-4289-a603-d9bbcb2b8637" (UID: "7ef85132-80d3-4289-a603-d9bbcb2b8637"). InnerVolumeSpecName "kube-api-access-9d95c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.574146 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerID="d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc" exitCode=0
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.574218 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.574240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ef85132-80d3-4289-a603-d9bbcb2b8637","Type":"ContainerDied","Data":"d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc"}
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.575354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ef85132-80d3-4289-a603-d9bbcb2b8637","Type":"ContainerDied","Data":"1a7b6b44e87066139ec386eb4bfe51eeaffb82cb061b3122d2e0b0ccf9760722"}
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.575388 4749 scope.go:117] "RemoveContainer" containerID="d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.578925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3380219c-08a7-4ecd-8646-6e39cb13137b","Type":"ContainerStarted","Data":"e5e041293d084c74f7ecdc8ae703e7d1bd4156f5bea71a11c66e53c41ff5ab68"}
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.584974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef85132-80d3-4289-a603-d9bbcb2b8637" (UID: "7ef85132-80d3-4289-a603-d9bbcb2b8637"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.585430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-config-data" (OuterVolumeSpecName: "config-data") pod "7ef85132-80d3-4289-a603-d9bbcb2b8637" (UID: "7ef85132-80d3-4289-a603-d9bbcb2b8637"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.598442 4749 scope.go:117] "RemoveContainer" containerID="0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.618392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7ef85132-80d3-4289-a603-d9bbcb2b8637" (UID: "7ef85132-80d3-4289-a603-d9bbcb2b8637"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.619164 4749 scope.go:117] "RemoveContainer" containerID="d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc"
Feb 19 18:55:23 crc kubenswrapper[4749]: E0219 18:55:23.619629 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc\": container with ID starting with d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc not found: ID does not exist" containerID="d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.619664 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc"} err="failed to get container status \"d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc\": rpc error: code = NotFound desc = could not find container \"d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc\": container with ID starting with d31ec1881a22895ad6394dd0e8f3ba624edd0a633207e7a2d4cdda21f256eafc not found: ID does not exist"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.619692 4749 scope.go:117] "RemoveContainer" containerID="0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242"
Feb 19 18:55:23 crc kubenswrapper[4749]: E0219 18:55:23.621314 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242\": container with ID starting with 0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242 not found: ID does not exist" containerID="0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.621522 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242"} err="failed to get container status \"0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242\": rpc error: code = NotFound desc = could not find container \"0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242\": container with ID starting with 0253be515aabad40db4cfa4d13e56d2b582427f1ba71f2c5d50e4b81e162c242 not found: ID does not exist"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.622795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ef85132-80d3-4289-a603-d9bbcb2b8637" (UID: "7ef85132-80d3-4289-a603-d9bbcb2b8637"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.656239 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.656283 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.656304 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d95c\" (UniqueName: \"kubernetes.io/projected/7ef85132-80d3-4289-a603-d9bbcb2b8637-kube-api-access-9d95c\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.656320 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.656334 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef85132-80d3-4289-a603-d9bbcb2b8637-logs\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.656346 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef85132-80d3-4289-a603-d9bbcb2b8637-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.910681 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.922793 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.943321 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:23 crc kubenswrapper[4749]: E0219 18:55:23.943756 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-log"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.943775 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-log"
Feb 19 18:55:23 crc kubenswrapper[4749]: E0219 18:55:23.943798 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-api"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.943805 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-api"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.943969 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-api"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.943990 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" containerName="nova-api-log"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.944924 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.947967 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.948187 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.948829 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 19 18:55:23 crc kubenswrapper[4749]: I0219 18:55:23.954679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.069046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-config-data\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.069263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-internal-tls-certs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.079080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-public-tls-certs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.079402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c708f4-7611-4c40-9dc9-8b941ff97b87-logs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.079575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhgw\" (UniqueName: \"kubernetes.io/projected/26c708f4-7611-4c40-9dc9-8b941ff97b87-kube-api-access-ndhgw\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.079746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.181488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.181627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-config-data\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.181894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-internal-tls-certs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.181927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-public-tls-certs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.181965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c708f4-7611-4c40-9dc9-8b941ff97b87-logs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.182065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhgw\" (UniqueName: \"kubernetes.io/projected/26c708f4-7611-4c40-9dc9-8b941ff97b87-kube-api-access-ndhgw\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.182592 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c708f4-7611-4c40-9dc9-8b941ff97b87-logs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.186560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.187174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-public-tls-certs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.193760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-internal-tls-certs\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.195986 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c708f4-7611-4c40-9dc9-8b941ff97b87-config-data\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.200346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhgw\" (UniqueName: \"kubernetes.io/projected/26c708f4-7611-4c40-9dc9-8b941ff97b87-kube-api-access-ndhgw\") pod \"nova-api-0\" (UID: \"26c708f4-7611-4c40-9dc9-8b941ff97b87\") " pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.340243 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.595336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3380219c-08a7-4ecd-8646-6e39cb13137b","Type":"ContainerStarted","Data":"c7cbe4ae0a8b6fdc971e365e5361f37b6bef6f66721ab5ac130b701aed84c6d1"}
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.595889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3380219c-08a7-4ecd-8646-6e39cb13137b","Type":"ContainerStarted","Data":"6fdb54d4ac07567cd038d700dc25199f3e1d1bf8266c59936c947c54278744d3"}
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.624008 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.623955365 podStartE2EDuration="2.623955365s" podCreationTimestamp="2026-02-19 18:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:55:24.613334537 +0000 UTC m=+1298.574554511" watchObservedRunningTime="2026-02-19 18:55:24.623955365 +0000 UTC m=+1298.585175319"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.692785 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef85132-80d3-4289-a603-d9bbcb2b8637" path="/var/lib/kubelet/pods/7ef85132-80d3-4289-a603-d9bbcb2b8637/volumes"
Feb 19 18:55:24 crc kubenswrapper[4749]: I0219 18:55:24.802251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 18:55:24 crc kubenswrapper[4749]: W0219 18:55:24.802921 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26c708f4_7611_4c40_9dc9_8b941ff97b87.slice/crio-052b5cc4b68563607b742c508ecb5d6a5eb2ca23192fd0cc2df1e96b55aa0c83 WatchSource:0}: Error finding container 052b5cc4b68563607b742c508ecb5d6a5eb2ca23192fd0cc2df1e96b55aa0c83: Status 404 returned error can't find the container with id 052b5cc4b68563607b742c508ecb5d6a5eb2ca23192fd0cc2df1e96b55aa0c83
Feb 19 18:55:25 crc kubenswrapper[4749]: I0219 18:55:25.604742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26c708f4-7611-4c40-9dc9-8b941ff97b87","Type":"ContainerStarted","Data":"e107e4b745f44df49e92a6efc22585e8d29bee28f925bc3cdab25655631708bf"}
Feb 19 18:55:25 crc kubenswrapper[4749]: I0219 18:55:25.605366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26c708f4-7611-4c40-9dc9-8b941ff97b87","Type":"ContainerStarted","Data":"58f110b962da66e21b33d5dfd53b1d089eb195fc28062daad85bd0dbfe14e89a"}
Feb 19 18:55:25 crc kubenswrapper[4749]: I0219 18:55:25.605383 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"26c708f4-7611-4c40-9dc9-8b941ff97b87","Type":"ContainerStarted","Data":"052b5cc4b68563607b742c508ecb5d6a5eb2ca23192fd0cc2df1e96b55aa0c83"}
Feb 19 18:55:25 crc kubenswrapper[4749]: I0219 18:55:25.633133 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.633109496 podStartE2EDuration="2.633109496s" podCreationTimestamp="2026-02-19 18:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:55:25.622565821 +0000 UTC m=+1299.583785795" watchObservedRunningTime="2026-02-19 18:55:25.633109496 +0000 UTC m=+1299.594329450"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.595422 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.617495 4749 generic.go:334] "Generic (PLEG): container finished" podID="471c4100-a672-4b83-842f-03a42aae3a92" containerID="c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85" exitCode=0
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.617565 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.617613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"471c4100-a672-4b83-842f-03a42aae3a92","Type":"ContainerDied","Data":"c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85"}
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.617663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"471c4100-a672-4b83-842f-03a42aae3a92","Type":"ContainerDied","Data":"76e825c462dc9ffbb756f73bc814b5648f7fd9c75734fae235683cce027c3600"}
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.617684 4749 scope.go:117] "RemoveContainer" containerID="c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.639392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftqcb\" (UniqueName: \"kubernetes.io/projected/471c4100-a672-4b83-842f-03a42aae3a92-kube-api-access-ftqcb\") pod \"471c4100-a672-4b83-842f-03a42aae3a92\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") "
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.639456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-combined-ca-bundle\") pod \"471c4100-a672-4b83-842f-03a42aae3a92\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") "
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.639707 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-config-data\") pod \"471c4100-a672-4b83-842f-03a42aae3a92\" (UID: \"471c4100-a672-4b83-842f-03a42aae3a92\") "
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.641746 4749 scope.go:117] "RemoveContainer" containerID="c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85"
Feb 19 18:55:26 crc kubenswrapper[4749]: E0219 18:55:26.646537 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85\": container with ID starting with c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85 not found: ID does not exist" containerID="c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.646871 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85"} err="failed to get container status \"c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85\": rpc error: code = NotFound desc = could not find container \"c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85\": container with ID starting with c29fa33ea1c8273b6bfee895a8e4ed0b697e18720714858f0daabdf5f958be85 not found: ID does not exist"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.650211 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/471c4100-a672-4b83-842f-03a42aae3a92-kube-api-access-ftqcb" (OuterVolumeSpecName: "kube-api-access-ftqcb") pod "471c4100-a672-4b83-842f-03a42aae3a92" (UID: "471c4100-a672-4b83-842f-03a42aae3a92"). InnerVolumeSpecName "kube-api-access-ftqcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.673759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "471c4100-a672-4b83-842f-03a42aae3a92" (UID: "471c4100-a672-4b83-842f-03a42aae3a92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.683847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-config-data" (OuterVolumeSpecName: "config-data") pod "471c4100-a672-4b83-842f-03a42aae3a92" (UID: "471c4100-a672-4b83-842f-03a42aae3a92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.742094 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.742134 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftqcb\" (UniqueName: \"kubernetes.io/projected/471c4100-a672-4b83-842f-03a42aae3a92-kube-api-access-ftqcb\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.742146 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471c4100-a672-4b83-842f-03a42aae3a92-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.937825 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.947761 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.993795 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 18:55:26 crc kubenswrapper[4749]: E0219 18:55:26.994492 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="471c4100-a672-4b83-842f-03a42aae3a92" containerName="nova-scheduler-scheduler"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.994565 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="471c4100-a672-4b83-842f-03a42aae3a92" containerName="nova-scheduler-scheduler"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.994895 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="471c4100-a672-4b83-842f-03a42aae3a92" containerName="nova-scheduler-scheduler"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.995779 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 18:55:26 crc kubenswrapper[4749]: I0219 18:55:26.997888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.004041 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.046793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-config-data\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.047766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thk4d\" (UniqueName: \"kubernetes.io/projected/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-kube-api-access-thk4d\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.047905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.150038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.150125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-config-data\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.150185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thk4d\" (UniqueName: \"kubernetes.io/projected/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-kube-api-access-thk4d\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.157761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-config-data\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.157829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.167229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thk4d\" (UniqueName: \"kubernetes.io/projected/cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704-kube-api-access-thk4d\") pod \"nova-scheduler-0\" (UID: \"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704\") " pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.313537 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.347715 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.426000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2ns9"
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.586876 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2ns9"]
Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.760071 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 18:55:27 crc kubenswrapper[4749]: W0219 18:55:27.767645 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfdcc5f4_f33b_4d0a_bab9_0c5ac64c9704.slice/crio-e1addae1879926aded37f9a69b49032d630561145335f5f68d4a60001aa2f2bb WatchSource:0}: Error finding container e1addae1879926aded37f9a69b49032d630561145335f5f68d4a60001aa2f2bb: Status 404 returned error
can't find the container with id e1addae1879926aded37f9a69b49032d630561145335f5f68d4a60001aa2f2bb Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.974458 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 18:55:27 crc kubenswrapper[4749]: I0219 18:55:27.974502 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 18:55:28 crc kubenswrapper[4749]: I0219 18:55:28.636359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704","Type":"ContainerStarted","Data":"381a1da16876d79075ac6baa47e243cbd8bfa137cbbc3a3326103d3f8467f57e"} Feb 19 18:55:28 crc kubenswrapper[4749]: I0219 18:55:28.638818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704","Type":"ContainerStarted","Data":"e1addae1879926aded37f9a69b49032d630561145335f5f68d4a60001aa2f2bb"} Feb 19 18:55:28 crc kubenswrapper[4749]: I0219 18:55:28.636508 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w2ns9" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="registry-server" containerID="cri-o://f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5" gracePeriod=2 Feb 19 18:55:28 crc kubenswrapper[4749]: I0219 18:55:28.696174 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="471c4100-a672-4b83-842f-03a42aae3a92" path="/var/lib/kubelet/pods/471c4100-a672-4b83-842f-03a42aae3a92/volumes" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.138801 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ns9" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.155557 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.155519462 podStartE2EDuration="3.155519462s" podCreationTimestamp="2026-02-19 18:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:55:28.665616872 +0000 UTC m=+1302.626836836" watchObservedRunningTime="2026-02-19 18:55:29.155519462 +0000 UTC m=+1303.116739416" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.298698 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79hxw\" (UniqueName: \"kubernetes.io/projected/144a66be-49a6-470b-af60-764fa5d4d7dd-kube-api-access-79hxw\") pod \"144a66be-49a6-470b-af60-764fa5d4d7dd\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.298825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-catalog-content\") pod \"144a66be-49a6-470b-af60-764fa5d4d7dd\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.298886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-utilities\") pod \"144a66be-49a6-470b-af60-764fa5d4d7dd\" (UID: \"144a66be-49a6-470b-af60-764fa5d4d7dd\") " Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.299778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-utilities" (OuterVolumeSpecName: "utilities") pod "144a66be-49a6-470b-af60-764fa5d4d7dd" (UID: 
"144a66be-49a6-470b-af60-764fa5d4d7dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.304931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144a66be-49a6-470b-af60-764fa5d4d7dd-kube-api-access-79hxw" (OuterVolumeSpecName: "kube-api-access-79hxw") pod "144a66be-49a6-470b-af60-764fa5d4d7dd" (UID: "144a66be-49a6-470b-af60-764fa5d4d7dd"). InnerVolumeSpecName "kube-api-access-79hxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.401428 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79hxw\" (UniqueName: \"kubernetes.io/projected/144a66be-49a6-470b-af60-764fa5d4d7dd-kube-api-access-79hxw\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.401471 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.410894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "144a66be-49a6-470b-af60-764fa5d4d7dd" (UID: "144a66be-49a6-470b-af60-764fa5d4d7dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.504082 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144a66be-49a6-470b-af60-764fa5d4d7dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.645948 4749 generic.go:334] "Generic (PLEG): container finished" podID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerID="f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5" exitCode=0 Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.646045 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ns9" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.646011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerDied","Data":"f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5"} Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.646118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ns9" event={"ID":"144a66be-49a6-470b-af60-764fa5d4d7dd","Type":"ContainerDied","Data":"dd7d03124fb75f94107be6384eb24863fa1c970bad4219cbc9709d64b1ec0dda"} Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.646143 4749 scope.go:117] "RemoveContainer" containerID="f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.687586 4749 scope.go:117] "RemoveContainer" containerID="f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.691773 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2ns9"] Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 
18:55:29.703769 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w2ns9"] Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.708293 4749 scope.go:117] "RemoveContainer" containerID="5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.785014 4749 scope.go:117] "RemoveContainer" containerID="f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5" Feb 19 18:55:29 crc kubenswrapper[4749]: E0219 18:55:29.787297 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5\": container with ID starting with f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5 not found: ID does not exist" containerID="f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.787365 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5"} err="failed to get container status \"f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5\": rpc error: code = NotFound desc = could not find container \"f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5\": container with ID starting with f369e021e3ae4ce848155df51969ee6960287f7ad18467ecc119b54858cce9e5 not found: ID does not exist" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.787400 4749 scope.go:117] "RemoveContainer" containerID="f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d" Feb 19 18:55:29 crc kubenswrapper[4749]: E0219 18:55:29.790776 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d\": container with ID 
starting with f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d not found: ID does not exist" containerID="f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.790826 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d"} err="failed to get container status \"f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d\": rpc error: code = NotFound desc = could not find container \"f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d\": container with ID starting with f1c7f8342a4d4cdb025c9a31cb33b77d99cc6d046bf761f23eea9338cf1f205d not found: ID does not exist" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.790857 4749 scope.go:117] "RemoveContainer" containerID="5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242" Feb 19 18:55:29 crc kubenswrapper[4749]: E0219 18:55:29.791624 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242\": container with ID starting with 5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242 not found: ID does not exist" containerID="5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242" Feb 19 18:55:29 crc kubenswrapper[4749]: I0219 18:55:29.791651 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242"} err="failed to get container status \"5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242\": rpc error: code = NotFound desc = could not find container \"5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242\": container with ID starting with 5f98f8ccf04f332955e0536c36623a5b4a323636df4e8c3616d0c65c611ab242 not found: 
ID does not exist" Feb 19 18:55:30 crc kubenswrapper[4749]: I0219 18:55:30.691084 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" path="/var/lib/kubelet/pods/144a66be-49a6-470b-af60-764fa5d4d7dd/volumes" Feb 19 18:55:32 crc kubenswrapper[4749]: I0219 18:55:32.313818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 18:55:32 crc kubenswrapper[4749]: I0219 18:55:32.974251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 18:55:32 crc kubenswrapper[4749]: I0219 18:55:32.974326 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 18:55:33 crc kubenswrapper[4749]: I0219 18:55:33.989244 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3380219c-08a7-4ecd-8646-6e39cb13137b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 18:55:33 crc kubenswrapper[4749]: I0219 18:55:33.989264 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3380219c-08a7-4ecd-8646-6e39cb13137b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 18:55:34 crc kubenswrapper[4749]: I0219 18:55:34.341261 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 18:55:34 crc kubenswrapper[4749]: I0219 18:55:34.341331 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 18:55:35 crc kubenswrapper[4749]: I0219 18:55:35.357279 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="26c708f4-7611-4c40-9dc9-8b941ff97b87" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 18:55:35 crc kubenswrapper[4749]: I0219 18:55:35.358077 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="26c708f4-7611-4c40-9dc9-8b941ff97b87" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 18:55:37 crc kubenswrapper[4749]: I0219 18:55:37.313679 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 18:55:37 crc kubenswrapper[4749]: I0219 18:55:37.346923 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 18:55:37 crc kubenswrapper[4749]: I0219 18:55:37.794141 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 18:55:38 crc kubenswrapper[4749]: I0219 18:55:38.146249 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 18:55:42 crc kubenswrapper[4749]: I0219 18:55:42.979275 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 18:55:42 crc kubenswrapper[4749]: I0219 18:55:42.979781 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 18:55:42 crc kubenswrapper[4749]: I0219 18:55:42.984175 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 18:55:42 crc kubenswrapper[4749]: I0219 18:55:42.984613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 18:55:44 crc 
kubenswrapper[4749]: I0219 18:55:44.351669 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 18:55:44 crc kubenswrapper[4749]: I0219 18:55:44.352017 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 18:55:44 crc kubenswrapper[4749]: I0219 18:55:44.361159 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 18:55:44 crc kubenswrapper[4749]: I0219 18:55:44.363606 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 18:55:44 crc kubenswrapper[4749]: I0219 18:55:44.832483 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 18:55:44 crc kubenswrapper[4749]: I0219 18:55:44.841624 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 18:55:53 crc kubenswrapper[4749]: I0219 18:55:53.091839 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:55:54 crc kubenswrapper[4749]: I0219 18:55:54.472953 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:55:56 crc kubenswrapper[4749]: I0219 18:55:56.527092 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="rabbitmq" containerID="cri-o://27a5fc8a7e6b23c377b6b816b449c02eb2f42223b70eefa4d494e95e767e7d8e" gracePeriod=604797 Feb 19 18:55:57 crc kubenswrapper[4749]: I0219 18:55:57.637318 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="rabbitmq" containerID="cri-o://a725c718506d3e939f66c1ae53a59f357a04f1a719595919ba7965a418ce89e8" gracePeriod=604797 Feb 19 18:55:57 crc 
kubenswrapper[4749]: I0219 18:55:57.940430 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Feb 19 18:55:57 crc kubenswrapper[4749]: I0219 18:55:57.951295 4749 generic.go:334] "Generic (PLEG): container finished" podID="042fb593-4898-4085-889e-7ccb375cf969" containerID="27a5fc8a7e6b23c377b6b816b449c02eb2f42223b70eefa4d494e95e767e7d8e" exitCode=0 Feb 19 18:55:57 crc kubenswrapper[4749]: I0219 18:55:57.951339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"042fb593-4898-4085-889e-7ccb375cf969","Type":"ContainerDied","Data":"27a5fc8a7e6b23c377b6b816b449c02eb2f42223b70eefa4d494e95e767e7d8e"} Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.126772 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.172894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-server-conf\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.172972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-config-data\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173062 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-plugins\") pod 
\"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-tls\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173153 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-confd\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173284 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-plugins-conf\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf58l\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-kube-api-access-kf58l\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173377 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/042fb593-4898-4085-889e-7ccb375cf969-erlang-cookie-secret\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/042fb593-4898-4085-889e-7ccb375cf969-pod-info\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.173444 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-erlang-cookie\") pod \"042fb593-4898-4085-889e-7ccb375cf969\" (UID: \"042fb593-4898-4085-889e-7ccb375cf969\") " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.177930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.181624 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.181944 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.187926 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.190390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/042fb593-4898-4085-889e-7ccb375cf969-pod-info" (OuterVolumeSpecName: "pod-info") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.190792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-kube-api-access-kf58l" (OuterVolumeSpecName: "kube-api-access-kf58l") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "kube-api-access-kf58l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.204536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.204791 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042fb593-4898-4085-889e-7ccb375cf969-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.224490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-config-data" (OuterVolumeSpecName: "config-data") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275824 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275862 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf58l\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-kube-api-access-kf58l\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275875 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/042fb593-4898-4085-889e-7ccb375cf969-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275887 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/042fb593-4898-4085-889e-7ccb375cf969-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275901 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275912 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275923 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: 
I0219 18:55:58.275934 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.275966 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.322940 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-server-conf" (OuterVolumeSpecName: "server-conf") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.333943 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.377487 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/042fb593-4898-4085-889e-7ccb375cf969-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.377524 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.386772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "042fb593-4898-4085-889e-7ccb375cf969" (UID: "042fb593-4898-4085-889e-7ccb375cf969"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.481517 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/042fb593-4898-4085-889e-7ccb375cf969-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.966468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"042fb593-4898-4085-889e-7ccb375cf969","Type":"ContainerDied","Data":"598b1485a47e9995ae401e652012917e716c7a7c520c41e6fc138c367cf7e054"} Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.966521 4749 scope.go:117] "RemoveContainer" containerID="27a5fc8a7e6b23c377b6b816b449c02eb2f42223b70eefa4d494e95e767e7d8e" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.966689 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.987162 4749 generic.go:334] "Generic (PLEG): container finished" podID="008062c0-9ccf-4fd2-9b54-63196268da38" containerID="a725c718506d3e939f66c1ae53a59f357a04f1a719595919ba7965a418ce89e8" exitCode=0 Feb 19 18:55:58 crc kubenswrapper[4749]: I0219 18:55:58.987241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"008062c0-9ccf-4fd2-9b54-63196268da38","Type":"ContainerDied","Data":"a725c718506d3e939f66c1ae53a59f357a04f1a719595919ba7965a418ce89e8"} Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.004643 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.020656 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.030401 4749 scope.go:117] "RemoveContainer" 
containerID="fd26a9103b0a88682847d38bd2a0a2ca1f91ec3eea2089769c23970dca2fdfd3" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.053414 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:55:59 crc kubenswrapper[4749]: E0219 18:55:59.053841 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="extract-content" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.054110 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="extract-content" Feb 19 18:55:59 crc kubenswrapper[4749]: E0219 18:55:59.054125 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="setup-container" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.054131 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="setup-container" Feb 19 18:55:59 crc kubenswrapper[4749]: E0219 18:55:59.054150 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="extract-utilities" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.054157 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="extract-utilities" Feb 19 18:55:59 crc kubenswrapper[4749]: E0219 18:55:59.054168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="rabbitmq" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.054173 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="rabbitmq" Feb 19 18:55:59 crc kubenswrapper[4749]: E0219 18:55:59.054185 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" 
containerName="registry-server" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.054190 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="registry-server" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.054371 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="rabbitmq" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.054395 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="144a66be-49a6-470b-af60-764fa5d4d7dd" containerName="registry-server" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.055340 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.060747 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.061020 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.061206 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-grnfr" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.061312 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.061453 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.061571 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.061659 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 18:55:59 crc 
kubenswrapper[4749]: I0219 18:55:59.065553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195392 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm95k\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-kube-api-access-pm95k\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195599 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f675112-5cb9-4988-b346-b29f1e2699f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " 
pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.195704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f675112-5cb9-4988-b346-b29f1e2699f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f675112-5cb9-4988-b346-b29f1e2699f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297858 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f675112-5cb9-4988-b346-b29f1e2699f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.297987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.298009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm95k\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-kube-api-access-pm95k\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.298054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.298088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.298287 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.298505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.298520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.299366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 
18:55:59.302913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.303939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.303936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.304458 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f675112-5cb9-4988-b346-b29f1e2699f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.306865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f675112-5cb9-4988-b346-b29f1e2699f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.307059 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f675112-5cb9-4988-b346-b29f1e2699f9-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.316220 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.325804 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm95k\" (UniqueName: \"kubernetes.io/projected/8f675112-5cb9-4988-b346-b29f1e2699f9-kube-api-access-pm95k\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.374141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f675112-5cb9-4988-b346-b29f1e2699f9\") " pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.454444 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503464 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-erlang-cookie\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503729 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-tls\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9l8h\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-kube-api-access-j9l8h\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-config-data\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/008062c0-9ccf-4fd2-9b54-63196268da38-pod-info\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503862 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-confd\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503902 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/008062c0-9ccf-4fd2-9b54-63196268da38-erlang-cookie-secret\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.503931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-server-conf\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.504101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-plugins\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.504141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-plugins-conf\") pod \"008062c0-9ccf-4fd2-9b54-63196268da38\" (UID: \"008062c0-9ccf-4fd2-9b54-63196268da38\") " Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 
18:55:59.505738 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.507648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.508386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.508402 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008062c0-9ccf-4fd2-9b54-63196268da38-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.509737 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/008062c0-9ccf-4fd2-9b54-63196268da38-pod-info" (OuterVolumeSpecName: "pod-info") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.510922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.514183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-kube-api-access-j9l8h" (OuterVolumeSpecName: "kube-api-access-j9l8h") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "kube-api-access-j9l8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.514452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.570855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-config-data" (OuterVolumeSpecName: "config-data") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.576292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-server-conf" (OuterVolumeSpecName: "server-conf") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607352 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607377 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607388 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607425 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 
19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607435 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607444 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9l8h\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-kube-api-access-j9l8h\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607454 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/008062c0-9ccf-4fd2-9b54-63196268da38-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607464 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607474 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/008062c0-9ccf-4fd2-9b54-63196268da38-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.607484 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008062c0-9ccf-4fd2-9b54-63196268da38-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.650364 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.709263 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.712749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "008062c0-9ccf-4fd2-9b54-63196268da38" (UID: "008062c0-9ccf-4fd2-9b54-63196268da38"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.818641 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008062c0-9ccf-4fd2-9b54-63196268da38-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 18:55:59 crc kubenswrapper[4749]: I0219 18:55:59.852531 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.012559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8f675112-5cb9-4988-b346-b29f1e2699f9","Type":"ContainerStarted","Data":"f5fbcfc9d92822161406523b474066d41fa0b2417a75184b29b2e43e761cf604"} Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.018677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"008062c0-9ccf-4fd2-9b54-63196268da38","Type":"ContainerDied","Data":"0ab55f0f2b448d6438081f2a89273710872f8800d82fadb48c5541554c55c361"} Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.018903 4749 scope.go:117] "RemoveContainer" containerID="a725c718506d3e939f66c1ae53a59f357a04f1a719595919ba7965a418ce89e8" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.019211 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.050977 4749 scope.go:117] "RemoveContainer" containerID="d424cd182682a0c438d89dbb09758c27fa545c0f0dd6fcc5b71037d7ded4f1c4" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.071386 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.079265 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.101433 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:56:00 crc kubenswrapper[4749]: E0219 18:56:00.101810 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="setup-container" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.105584 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="setup-container" Feb 19 18:56:00 crc kubenswrapper[4749]: E0219 18:56:00.105636 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="rabbitmq" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.105643 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="rabbitmq" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.106032 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" containerName="rabbitmq" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.107132 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.115612 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.115797 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hjc7c" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.115935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.116322 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.116721 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.116787 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.124665 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.128041 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234135 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234191 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gvscd\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-kube-api-access-gvscd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6783e255-9125-478b-8c87-61176c735e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234332 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6783e255-9125-478b-8c87-61176c735e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234357 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.234470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/6783e255-9125-478b-8c87-61176c735e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336690 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336727 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvscd\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-kube-api-access-gvscd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6783e255-9125-478b-8c87-61176c735e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.336879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 
crc kubenswrapper[4749]: I0219 18:56:00.337883 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.337915 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.338073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.338651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.339015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.339502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6783e255-9125-478b-8c87-61176c735e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.341424 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.341691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6783e255-9125-478b-8c87-61176c735e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.345598 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6783e255-9125-478b-8c87-61176c735e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.346636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.362736 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvscd\" (UniqueName: \"kubernetes.io/projected/6783e255-9125-478b-8c87-61176c735e2c-kube-api-access-gvscd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.386806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6783e255-9125-478b-8c87-61176c735e2c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.434514 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.691603 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008062c0-9ccf-4fd2-9b54-63196268da38" path="/var/lib/kubelet/pods/008062c0-9ccf-4fd2-9b54-63196268da38/volumes" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.720296 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="042fb593-4898-4085-889e-7ccb375cf969" path="/var/lib/kubelet/pods/042fb593-4898-4085-889e-7ccb375cf969/volumes" Feb 19 18:56:00 crc kubenswrapper[4749]: I0219 18:56:00.973946 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 18:56:01 crc kubenswrapper[4749]: I0219 18:56:01.032695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6783e255-9125-478b-8c87-61176c735e2c","Type":"ContainerStarted","Data":"a2f26a73977dbaae31515444fadc9c42dfcf12456c3e46dbcfef4e15f29d3f3b"} Feb 19 18:56:02 crc kubenswrapper[4749]: I0219 18:56:02.041765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8f675112-5cb9-4988-b346-b29f1e2699f9","Type":"ContainerStarted","Data":"66e4d27e90172c5543ac1a50dbdfa888865b2ba95b3b176205cc4833661d72a1"} Feb 19 18:56:02 crc kubenswrapper[4749]: I0219 18:56:02.959720 4749 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="042fb593-4898-4085-889e-7ccb375cf969" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: i/o timeout" Feb 19 18:56:03 crc kubenswrapper[4749]: I0219 18:56:03.056718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6783e255-9125-478b-8c87-61176c735e2c","Type":"ContainerStarted","Data":"fa4ca1d0f9829a7e1f5a8fd10bcf6dc73dcccbe49bfe5e82c848865b69bb456e"} Feb 19 18:56:07 crc kubenswrapper[4749]: I0219 18:56:07.840687 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6454498675-gg82x"] Feb 19 18:56:07 crc kubenswrapper[4749]: I0219 18:56:07.843192 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:07 crc kubenswrapper[4749]: I0219 18:56:07.846914 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 19 18:56:07 crc kubenswrapper[4749]: I0219 18:56:07.854521 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6454498675-gg82x"] Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.008209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-config\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.008331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-swift-storage-0\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " 
pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.008397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.008413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5z6m\" (UniqueName: \"kubernetes.io/projected/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-kube-api-access-h5z6m\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.008462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-svc\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.008484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.008501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6454498675-gg82x\" 
(UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.110178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-config\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.110277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-swift-storage-0\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.110340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.110356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5z6m\" (UniqueName: \"kubernetes.io/projected/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-kube-api-access-h5z6m\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.110405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-svc\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " 
pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.110425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.110441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.111492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.111492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-swift-storage-0\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.111498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc 
kubenswrapper[4749]: I0219 18:56:08.111610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-config\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.111724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-svc\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.112338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.131866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5z6m\" (UniqueName: \"kubernetes.io/projected/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-kube-api-access-h5z6m\") pod \"dnsmasq-dns-6454498675-gg82x\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") " pod="openstack/dnsmasq-dns-6454498675-gg82x" Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.161902 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6454498675-gg82x"
Feb 19 18:56:08 crc kubenswrapper[4749]: I0219 18:56:08.690391 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6454498675-gg82x"]
Feb 19 18:56:09 crc kubenswrapper[4749]: I0219 18:56:09.109438 4749 generic.go:334] "Generic (PLEG): container finished" podID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerID="13e6fa08ad4d8beba7839d176a657ef4afd0510449b31eb803559a40ca7ad1ce" exitCode=0
Feb 19 18:56:09 crc kubenswrapper[4749]: I0219 18:56:09.109479 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454498675-gg82x" event={"ID":"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4","Type":"ContainerDied","Data":"13e6fa08ad4d8beba7839d176a657ef4afd0510449b31eb803559a40ca7ad1ce"}
Feb 19 18:56:09 crc kubenswrapper[4749]: I0219 18:56:09.109505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454498675-gg82x" event={"ID":"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4","Type":"ContainerStarted","Data":"7d978430c40d24093cfa9ecee4052daa7e51041d8e9b2043c0a17e1d22a1a0d7"}
Feb 19 18:56:10 crc kubenswrapper[4749]: I0219 18:56:10.119353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454498675-gg82x" event={"ID":"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4","Type":"ContainerStarted","Data":"e6fd50f1587de509e8c97331e4829800fc5e9b9d93567e9684fa72aa8b0bfd6c"}
Feb 19 18:56:10 crc kubenswrapper[4749]: I0219 18:56:10.119994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6454498675-gg82x"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.164110 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6454498675-gg82x"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.190543 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6454498675-gg82x" podStartSLOduration=11.19051607 podStartE2EDuration="11.19051607s" podCreationTimestamp="2026-02-19 18:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:56:10.137430906 +0000 UTC m=+1344.098650870" watchObservedRunningTime="2026-02-19 18:56:18.19051607 +0000 UTC m=+1352.151736044"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.235300 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-597b667f69-7hpgv"]
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.235561 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" podUID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerName="dnsmasq-dns" containerID="cri-o://ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979" gracePeriod=10
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.421975 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"]
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.424865 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.443358 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"]
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.514015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-dns-swift-storage-0\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.514102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xl6\" (UniqueName: \"kubernetes.io/projected/7707fe3e-adab-4755-bf50-f74bb3924913-kube-api-access-49xl6\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.514170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-dns-svc\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.514209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.514259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-ovsdbserver-sb\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.514376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-config\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.514439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-ovsdbserver-nb\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.616621 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-ovsdbserver-nb\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.616807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-dns-swift-storage-0\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.616839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49xl6\" (UniqueName: \"kubernetes.io/projected/7707fe3e-adab-4755-bf50-f74bb3924913-kube-api-access-49xl6\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.616917 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-dns-svc\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.616953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.616983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-ovsdbserver-sb\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.617015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-config\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.617745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-ovsdbserver-nb\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.618166 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-config\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.618419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.618528 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-dns-svc\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.618781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-dns-swift-storage-0\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.618804 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7707fe3e-adab-4755-bf50-f74bb3924913-ovsdbserver-sb\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.636745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xl6\" (UniqueName: \"kubernetes.io/projected/7707fe3e-adab-4755-bf50-f74bb3924913-kube-api-access-49xl6\") pod \"dnsmasq-dns-7c96bd5bf7-wmjrv\" (UID: \"7707fe3e-adab-4755-bf50-f74bb3924913\") " pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.803334 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-597b667f69-7hpgv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.808786 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.923519 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-nb\") pod \"1c41744d-1aa4-4ea7-bef0-fa0838093916\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") "
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.923673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-sb\") pod \"1c41744d-1aa4-4ea7-bef0-fa0838093916\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") "
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.923760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-svc\") pod \"1c41744d-1aa4-4ea7-bef0-fa0838093916\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") "
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.923824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-config\") pod \"1c41744d-1aa4-4ea7-bef0-fa0838093916\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") "
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.923933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-swift-storage-0\") pod \"1c41744d-1aa4-4ea7-bef0-fa0838093916\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") "
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.924070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfdlj\" (UniqueName: \"kubernetes.io/projected/1c41744d-1aa4-4ea7-bef0-fa0838093916-kube-api-access-pfdlj\") pod \"1c41744d-1aa4-4ea7-bef0-fa0838093916\" (UID: \"1c41744d-1aa4-4ea7-bef0-fa0838093916\") "
Feb 19 18:56:18 crc kubenswrapper[4749]: I0219 18:56:18.941380 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c41744d-1aa4-4ea7-bef0-fa0838093916-kube-api-access-pfdlj" (OuterVolumeSpecName: "kube-api-access-pfdlj") pod "1c41744d-1aa4-4ea7-bef0-fa0838093916" (UID: "1c41744d-1aa4-4ea7-bef0-fa0838093916"). InnerVolumeSpecName "kube-api-access-pfdlj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.026560 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfdlj\" (UniqueName: \"kubernetes.io/projected/1c41744d-1aa4-4ea7-bef0-fa0838093916-kube-api-access-pfdlj\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.034884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c41744d-1aa4-4ea7-bef0-fa0838093916" (UID: "1c41744d-1aa4-4ea7-bef0-fa0838093916"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.050160 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c41744d-1aa4-4ea7-bef0-fa0838093916" (UID: "1c41744d-1aa4-4ea7-bef0-fa0838093916"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.065618 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-config" (OuterVolumeSpecName: "config") pod "1c41744d-1aa4-4ea7-bef0-fa0838093916" (UID: "1c41744d-1aa4-4ea7-bef0-fa0838093916"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.065901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1c41744d-1aa4-4ea7-bef0-fa0838093916" (UID: "1c41744d-1aa4-4ea7-bef0-fa0838093916"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.066528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c41744d-1aa4-4ea7-bef0-fa0838093916" (UID: "1c41744d-1aa4-4ea7-bef0-fa0838093916"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.129311 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.129350 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.129360 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.129369 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.129377 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c41744d-1aa4-4ea7-bef0-fa0838093916-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.205325 4749 generic.go:334] "Generic (PLEG): container finished" podID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerID="ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979" exitCode=0
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.205364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" event={"ID":"1c41744d-1aa4-4ea7-bef0-fa0838093916","Type":"ContainerDied","Data":"ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979"}
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.205392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597b667f69-7hpgv" event={"ID":"1c41744d-1aa4-4ea7-bef0-fa0838093916","Type":"ContainerDied","Data":"e0d3a89faf507083f75f5c969d131a8c6bad0eb728637637c97600635fd47155"}
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.205409 4749 scope.go:117] "RemoveContainer" containerID="ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979"
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.205408 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-597b667f69-7hpgv"
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.211479 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"]
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.230318 4749 scope.go:117] "RemoveContainer" containerID="0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c"
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.260801 4749 scope.go:117] "RemoveContainer" containerID="ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979"
Feb 19 18:56:19 crc kubenswrapper[4749]: E0219 18:56:19.261174 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979\": container with ID starting with ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979 not found: ID does not exist" containerID="ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979"
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.261199 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979"} err="failed to get container status \"ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979\": rpc error: code = NotFound desc = could not find container \"ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979\": container with ID starting with ba2f09c8738df305668a914b6221f68c65bd370e5b638999b837ad408c987979 not found: ID does not exist"
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.261220 4749 scope.go:117] "RemoveContainer" containerID="0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c"
Feb 19 18:56:19 crc kubenswrapper[4749]: E0219 18:56:19.261516 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c\": container with ID starting with 0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c not found: ID does not exist" containerID="0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c"
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.261537 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c"} err="failed to get container status \"0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c\": rpc error: code = NotFound desc = could not find container \"0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c\": container with ID starting with 0517e53c34f55e02e9386a7ab69e171aa5140859bf45b2412c01fd4cd087964c not found: ID does not exist"
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.266918 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-597b667f69-7hpgv"]
Feb 19 18:56:19 crc kubenswrapper[4749]: I0219 18:56:19.278780 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-597b667f69-7hpgv"]
Feb 19 18:56:20 crc kubenswrapper[4749]: I0219 18:56:20.214908 4749 generic.go:334] "Generic (PLEG): container finished" podID="7707fe3e-adab-4755-bf50-f74bb3924913" containerID="5cb14a341b7cb9deb82e15b15d927df9766614f1f8182a23a0812bffaab0fab6" exitCode=0
Feb 19 18:56:20 crc kubenswrapper[4749]: I0219 18:56:20.214949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv" event={"ID":"7707fe3e-adab-4755-bf50-f74bb3924913","Type":"ContainerDied","Data":"5cb14a341b7cb9deb82e15b15d927df9766614f1f8182a23a0812bffaab0fab6"}
Feb 19 18:56:20 crc kubenswrapper[4749]: I0219 18:56:20.215302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv" event={"ID":"7707fe3e-adab-4755-bf50-f74bb3924913","Type":"ContainerStarted","Data":"c66866bd8b8b5ab73d20d444565f11b27c63b49bf281fdcb2f164577c2234c75"}
Feb 19 18:56:20 crc kubenswrapper[4749]: I0219 18:56:20.691568 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c41744d-1aa4-4ea7-bef0-fa0838093916" path="/var/lib/kubelet/pods/1c41744d-1aa4-4ea7-bef0-fa0838093916/volumes"
Feb 19 18:56:21 crc kubenswrapper[4749]: I0219 18:56:21.226435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv" event={"ID":"7707fe3e-adab-4755-bf50-f74bb3924913","Type":"ContainerStarted","Data":"7ca91a4a045d04c824d644f3a3724c1a6ba8bc914ac3b8e2ab24d8348ba0908c"}
Feb 19 18:56:21 crc kubenswrapper[4749]: I0219 18:56:21.227768 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:21 crc kubenswrapper[4749]: I0219 18:56:21.253248 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv" podStartSLOduration=3.253226437 podStartE2EDuration="3.253226437s" podCreationTimestamp="2026-02-19 18:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:56:21.247795486 +0000 UTC m=+1355.209015470" watchObservedRunningTime="2026-02-19 18:56:21.253226437 +0000 UTC m=+1355.214446391"
Feb 19 18:56:28 crc kubenswrapper[4749]: I0219 18:56:28.811295 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c96bd5bf7-wmjrv"
Feb 19 18:56:28 crc kubenswrapper[4749]: I0219 18:56:28.885715 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6454498675-gg82x"]
Feb 19 18:56:28 crc kubenswrapper[4749]: I0219 18:56:28.885952 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6454498675-gg82x" podUID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerName="dnsmasq-dns" containerID="cri-o://e6fd50f1587de509e8c97331e4829800fc5e9b9d93567e9684fa72aa8b0bfd6c" gracePeriod=10
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.307212 4749 generic.go:334] "Generic (PLEG): container finished" podID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerID="e6fd50f1587de509e8c97331e4829800fc5e9b9d93567e9684fa72aa8b0bfd6c" exitCode=0
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.307418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454498675-gg82x" event={"ID":"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4","Type":"ContainerDied","Data":"e6fd50f1587de509e8c97331e4829800fc5e9b9d93567e9684fa72aa8b0bfd6c"}
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.882551 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454498675-gg82x"
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.983991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-swift-storage-0\") pod \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") "
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.984076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5z6m\" (UniqueName: \"kubernetes.io/projected/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-kube-api-access-h5z6m\") pod \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") "
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.984146 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-svc\") pod \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") "
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.984186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-openstack-edpm-ipam\") pod \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") "
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.984270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-sb\") pod \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") "
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.984294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-config\") pod \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") "
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.984386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-nb\") pod \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\" (UID: \"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4\") "
Feb 19 18:56:29 crc kubenswrapper[4749]: I0219 18:56:29.991165 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-kube-api-access-h5z6m" (OuterVolumeSpecName: "kube-api-access-h5z6m") pod "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" (UID: "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4"). InnerVolumeSpecName "kube-api-access-h5z6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.038321 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" (UID: "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.039550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" (UID: "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.042323 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-config" (OuterVolumeSpecName: "config") pod "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" (UID: "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.045857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" (UID: "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.046085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" (UID: "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.048669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" (UID: "651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.087248 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.087279 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.087412 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5z6m\" (UniqueName: \"kubernetes.io/projected/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-kube-api-access-h5z6m\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.087425 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.087434 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.087442 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.087450 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4-config\") on node \"crc\" DevicePath \"\""
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.317939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454498675-gg82x" event={"ID":"651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4","Type":"ContainerDied","Data":"7d978430c40d24093cfa9ecee4052daa7e51041d8e9b2043c0a17e1d22a1a0d7"}
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.318003 4749 scope.go:117] "RemoveContainer" containerID="e6fd50f1587de509e8c97331e4829800fc5e9b9d93567e9684fa72aa8b0bfd6c"
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.318017 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454498675-gg82x"
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.351604 4749 scope.go:117] "RemoveContainer" containerID="13e6fa08ad4d8beba7839d176a657ef4afd0510449b31eb803559a40ca7ad1ce"
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.360241 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6454498675-gg82x"]
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.366601 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6454498675-gg82x"]
Feb 19 18:56:30 crc kubenswrapper[4749]: I0219 18:56:30.691784 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" path="/var/lib/kubelet/pods/651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4/volumes"
Feb 19 18:56:34 crc kubenswrapper[4749]: I0219 18:56:34.368758 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f675112-5cb9-4988-b346-b29f1e2699f9" containerID="66e4d27e90172c5543ac1a50dbdfa888865b2ba95b3b176205cc4833661d72a1" exitCode=0
Feb 19 18:56:34 crc kubenswrapper[4749]: I0219 18:56:34.369502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8f675112-5cb9-4988-b346-b29f1e2699f9","Type":"ContainerDied","Data":"66e4d27e90172c5543ac1a50dbdfa888865b2ba95b3b176205cc4833661d72a1"}
Feb 19 18:56:35 crc kubenswrapper[4749]: I0219 18:56:35.380756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8f675112-5cb9-4988-b346-b29f1e2699f9","Type":"ContainerStarted","Data":"511afeebdb7edc2d52f67c2a08f674b0f21446ae97c6b935050247877b4a6a7e"}
Feb 19 18:56:35 crc kubenswrapper[4749]: I0219 18:56:35.381277 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 19 18:56:35 crc kubenswrapper[4749]: I0219 18:56:35.383200 4749 generic.go:334] "Generic (PLEG): container finished" podID="6783e255-9125-478b-8c87-61176c735e2c" containerID="fa4ca1d0f9829a7e1f5a8fd10bcf6dc73dcccbe49bfe5e82c848865b69bb456e" exitCode=0
Feb 19 18:56:35 crc kubenswrapper[4749]: I0219 18:56:35.383229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6783e255-9125-478b-8c87-61176c735e2c","Type":"ContainerDied","Data":"fa4ca1d0f9829a7e1f5a8fd10bcf6dc73dcccbe49bfe5e82c848865b69bb456e"}
Feb 19 18:56:35 crc kubenswrapper[4749]: I0219 18:56:35.411188 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.411168129 podStartE2EDuration="36.411168129s" podCreationTimestamp="2026-02-19 18:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:56:35.402231143 +0000 UTC m=+1369.363451117" watchObservedRunningTime="2026-02-19 18:56:35.411168129 +0000 UTC m=+1369.372388083"
Feb 19 18:56:36 crc kubenswrapper[4749]: I0219 18:56:36.395262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6783e255-9125-478b-8c87-61176c735e2c","Type":"ContainerStarted","Data":"4f8282163e0794957aa1b7eba3c4ae80bfd2f8d413e145998e0875b24ac3954d"}
Feb 19 18:56:36 crc kubenswrapper[4749]: I0219 18:56:36.395809 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 18:56:36 crc kubenswrapper[4749]: I0219 18:56:36.419509 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.4194866 podStartE2EDuration="36.4194866s" podCreationTimestamp="2026-02-19 18:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 18:56:36.417180175 +0000 UTC m=+1370.378400139" watchObservedRunningTime="2026-02-19
18:56:36.4194866 +0000 UTC m=+1370.380706554" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.071992 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42"] Feb 19 18:56:47 crc kubenswrapper[4749]: E0219 18:56:47.072880 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerName="dnsmasq-dns" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.072890 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerName="dnsmasq-dns" Feb 19 18:56:47 crc kubenswrapper[4749]: E0219 18:56:47.072905 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerName="init" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.072911 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerName="init" Feb 19 18:56:47 crc kubenswrapper[4749]: E0219 18:56:47.072920 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerName="init" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.072927 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerName="init" Feb 19 18:56:47 crc kubenswrapper[4749]: E0219 18:56:47.072951 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerName="dnsmasq-dns" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.072956 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerName="dnsmasq-dns" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.073149 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="651f2c9a-bcc4-43ea-a78b-ad0dd70e57b4" containerName="dnsmasq-dns" Feb 19 18:56:47 crc kubenswrapper[4749]: 
I0219 18:56:47.073160 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c41744d-1aa4-4ea7-bef0-fa0838093916" containerName="dnsmasq-dns" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.073787 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.075984 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.076365 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.076665 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.077112 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.094127 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42"] Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.216360 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.216429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.216460 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.216513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbl99\" (UniqueName: \"kubernetes.io/projected/bf17385f-8d5b-43a5-82c8-9d8bd893e056-kube-api-access-zbl99\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.318187 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.318252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: 
\"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.318282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.318344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbl99\" (UniqueName: \"kubernetes.io/projected/bf17385f-8d5b-43a5-82c8-9d8bd893e056-kube-api-access-zbl99\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.327044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.327268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.332475 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.335374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbl99\" (UniqueName: \"kubernetes.io/projected/bf17385f-8d5b-43a5-82c8-9d8bd893e056-kube-api-access-zbl99\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:47 crc kubenswrapper[4749]: I0219 18:56:47.394231 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:56:48 crc kubenswrapper[4749]: I0219 18:56:48.154265 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42"] Feb 19 18:56:48 crc kubenswrapper[4749]: W0219 18:56:48.154389 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf17385f_8d5b_43a5_82c8_9d8bd893e056.slice/crio-3639a5b9eca8d5cd4e47a9dfc705b8cc6755fcac39506121e7d04a6a21078a9e WatchSource:0}: Error finding container 3639a5b9eca8d5cd4e47a9dfc705b8cc6755fcac39506121e7d04a6a21078a9e: Status 404 returned error can't find the container with id 3639a5b9eca8d5cd4e47a9dfc705b8cc6755fcac39506121e7d04a6a21078a9e Feb 19 18:56:48 crc kubenswrapper[4749]: I0219 18:56:48.501883 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" 
event={"ID":"bf17385f-8d5b-43a5-82c8-9d8bd893e056","Type":"ContainerStarted","Data":"3639a5b9eca8d5cd4e47a9dfc705b8cc6755fcac39506121e7d04a6a21078a9e"} Feb 19 18:56:49 crc kubenswrapper[4749]: I0219 18:56:49.458212 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 18:56:50 crc kubenswrapper[4749]: I0219 18:56:50.442296 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 18:56:54 crc kubenswrapper[4749]: I0219 18:56:54.725364 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:56:54 crc kubenswrapper[4749]: I0219 18:56:54.725902 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:56:57 crc kubenswrapper[4749]: I0219 18:56:57.627157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" event={"ID":"bf17385f-8d5b-43a5-82c8-9d8bd893e056","Type":"ContainerStarted","Data":"fad0f0fc61414d25001859294ffe60adc22147629652781eda1f960402f516e0"} Feb 19 18:56:57 crc kubenswrapper[4749]: I0219 18:56:57.658168 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" podStartSLOduration=2.106584573 podStartE2EDuration="10.658152524s" podCreationTimestamp="2026-02-19 18:56:47 +0000 UTC" firstStartedPulling="2026-02-19 18:56:48.158841648 +0000 UTC m=+1382.120061602" 
lastFinishedPulling="2026-02-19 18:56:56.710409599 +0000 UTC m=+1390.671629553" observedRunningTime="2026-02-19 18:56:57.650190641 +0000 UTC m=+1391.611410595" watchObservedRunningTime="2026-02-19 18:56:57.658152524 +0000 UTC m=+1391.619372478" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.452622 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-456hf"] Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.456913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.466250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-456hf"] Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.524236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-utilities\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.524322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-catalog-content\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.524391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlrl\" (UniqueName: \"kubernetes.io/projected/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-kube-api-access-8xlrl\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " 
pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.626305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlrl\" (UniqueName: \"kubernetes.io/projected/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-kube-api-access-8xlrl\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.626682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-utilities\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.626827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-catalog-content\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.627308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-utilities\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.627317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-catalog-content\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" 
Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.647213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlrl\" (UniqueName: \"kubernetes.io/projected/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-kube-api-access-8xlrl\") pod \"redhat-marketplace-456hf\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") " pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:02 crc kubenswrapper[4749]: I0219 18:57:02.791633 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-456hf" Feb 19 18:57:03 crc kubenswrapper[4749]: I0219 18:57:03.294589 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-456hf"] Feb 19 18:57:03 crc kubenswrapper[4749]: I0219 18:57:03.680001 4749 generic.go:334] "Generic (PLEG): container finished" podID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerID="2a3cfc7b21613006e39473e724c892a91e8a067630800f25afea1d3d990dcd36" exitCode=0 Feb 19 18:57:03 crc kubenswrapper[4749]: I0219 18:57:03.680061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-456hf" event={"ID":"31148a4b-37af-4f5b-b0a9-84e707ba1ad4","Type":"ContainerDied","Data":"2a3cfc7b21613006e39473e724c892a91e8a067630800f25afea1d3d990dcd36"} Feb 19 18:57:03 crc kubenswrapper[4749]: I0219 18:57:03.680370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-456hf" event={"ID":"31148a4b-37af-4f5b-b0a9-84e707ba1ad4","Type":"ContainerStarted","Data":"f9aa0b023c594e6c4267cbfd8079cd4858f1d76874d41378c0b03cba7ea61865"} Feb 19 18:57:04 crc kubenswrapper[4749]: I0219 18:57:04.691219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-456hf" event={"ID":"31148a4b-37af-4f5b-b0a9-84e707ba1ad4","Type":"ContainerStarted","Data":"3e2a227a03f0d4f3bb7a5eef724a9153e2bfdc4ca3e970468d526c6fa6fc2845"} Feb 19 
18:57:05 crc kubenswrapper[4749]: I0219 18:57:05.702177 4749 generic.go:334] "Generic (PLEG): container finished" podID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerID="3e2a227a03f0d4f3bb7a5eef724a9153e2bfdc4ca3e970468d526c6fa6fc2845" exitCode=0 Feb 19 18:57:05 crc kubenswrapper[4749]: I0219 18:57:05.702257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-456hf" event={"ID":"31148a4b-37af-4f5b-b0a9-84e707ba1ad4","Type":"ContainerDied","Data":"3e2a227a03f0d4f3bb7a5eef724a9153e2bfdc4ca3e970468d526c6fa6fc2845"} Feb 19 18:57:06 crc kubenswrapper[4749]: I0219 18:57:06.712864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-456hf" event={"ID":"31148a4b-37af-4f5b-b0a9-84e707ba1ad4","Type":"ContainerStarted","Data":"820799f17d84dba7650e39ca2f142396b2ac474735f4ffae9c691a9833c0e892"} Feb 19 18:57:06 crc kubenswrapper[4749]: I0219 18:57:06.745058 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-456hf" podStartSLOduration=2.072860743 podStartE2EDuration="4.745006915s" podCreationTimestamp="2026-02-19 18:57:02 +0000 UTC" firstStartedPulling="2026-02-19 18:57:03.682113393 +0000 UTC m=+1397.643333347" lastFinishedPulling="2026-02-19 18:57:06.354259565 +0000 UTC m=+1400.315479519" observedRunningTime="2026-02-19 18:57:06.730973065 +0000 UTC m=+1400.692193039" watchObservedRunningTime="2026-02-19 18:57:06.745006915 +0000 UTC m=+1400.706226869" Feb 19 18:57:07 crc kubenswrapper[4749]: I0219 18:57:07.722671 4749 generic.go:334] "Generic (PLEG): container finished" podID="bf17385f-8d5b-43a5-82c8-9d8bd893e056" containerID="fad0f0fc61414d25001859294ffe60adc22147629652781eda1f960402f516e0" exitCode=0 Feb 19 18:57:07 crc kubenswrapper[4749]: I0219 18:57:07.723123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" 
event={"ID":"bf17385f-8d5b-43a5-82c8-9d8bd893e056","Type":"ContainerDied","Data":"fad0f0fc61414d25001859294ffe60adc22147629652781eda1f960402f516e0"} Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.058199 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s5mfd"] Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.065292 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.075656 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5mfd"] Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.162771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-utilities\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.162830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvjb5\" (UniqueName: \"kubernetes.io/projected/6c321926-0b01-4bd4-83ff-2fa26e0d312a-kube-api-access-jvjb5\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.162860 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-catalog-content\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.234465 
4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.264269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-utilities\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.264334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvjb5\" (UniqueName: \"kubernetes.io/projected/6c321926-0b01-4bd4-83ff-2fa26e0d312a-kube-api-access-jvjb5\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.264363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-catalog-content\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.265251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-catalog-content\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.265767 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-utilities\") pod 
\"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.291760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvjb5\" (UniqueName: \"kubernetes.io/projected/6c321926-0b01-4bd4-83ff-2fa26e0d312a-kube-api-access-jvjb5\") pod \"community-operators-s5mfd\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") " pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.365547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-repo-setup-combined-ca-bundle\") pod \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.366087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-inventory\") pod \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.366150 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbl99\" (UniqueName: \"kubernetes.io/projected/bf17385f-8d5b-43a5-82c8-9d8bd893e056-kube-api-access-zbl99\") pod \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") " Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.366195 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-ssh-key-openstack-edpm-ipam\") pod \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\" (UID: \"bf17385f-8d5b-43a5-82c8-9d8bd893e056\") 
" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.372195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bf17385f-8d5b-43a5-82c8-9d8bd893e056" (UID: "bf17385f-8d5b-43a5-82c8-9d8bd893e056"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.373960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf17385f-8d5b-43a5-82c8-9d8bd893e056-kube-api-access-zbl99" (OuterVolumeSpecName: "kube-api-access-zbl99") pod "bf17385f-8d5b-43a5-82c8-9d8bd893e056" (UID: "bf17385f-8d5b-43a5-82c8-9d8bd893e056"). InnerVolumeSpecName "kube-api-access-zbl99". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.401524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-inventory" (OuterVolumeSpecName: "inventory") pod "bf17385f-8d5b-43a5-82c8-9d8bd893e056" (UID: "bf17385f-8d5b-43a5-82c8-9d8bd893e056"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.404061 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.407894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf17385f-8d5b-43a5-82c8-9d8bd893e056" (UID: "bf17385f-8d5b-43a5-82c8-9d8bd893e056"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.469046 4749 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.469081 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.469092 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbl99\" (UniqueName: \"kubernetes.io/projected/bf17385f-8d5b-43a5-82c8-9d8bd893e056-kube-api-access-zbl99\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.469102 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf17385f-8d5b-43a5-82c8-9d8bd893e056-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.784665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42" event={"ID":"bf17385f-8d5b-43a5-82c8-9d8bd893e056","Type":"ContainerDied","Data":"3639a5b9eca8d5cd4e47a9dfc705b8cc6755fcac39506121e7d04a6a21078a9e"}
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.784705 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3639a5b9eca8d5cd4e47a9dfc705b8cc6755fcac39506121e7d04a6a21078a9e"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.784762 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.857979 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"]
Feb 19 18:57:09 crc kubenswrapper[4749]: E0219 18:57:09.858818 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf17385f-8d5b-43a5-82c8-9d8bd893e056" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.858844 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf17385f-8d5b-43a5-82c8-9d8bd893e056" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.859123 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf17385f-8d5b-43a5-82c8-9d8bd893e056" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.859983 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.861684 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.861952 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.862131 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.862932 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.867362 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"]
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.962639 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5mfd"]
Feb 19 18:57:09 crc kubenswrapper[4749]: W0219 18:57:09.965120 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c321926_0b01_4bd4_83ff_2fa26e0d312a.slice/crio-ee63d61099bd5fc3054c14b4ebb79042f18be91b384eca5e45e36a04cbbc43e9 WatchSource:0}: Error finding container ee63d61099bd5fc3054c14b4ebb79042f18be91b384eca5e45e36a04cbbc43e9: Status 404 returned error can't find the container with id ee63d61099bd5fc3054c14b4ebb79042f18be91b384eca5e45e36a04cbbc43e9
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.994959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6cqp\" (UniqueName: \"kubernetes.io/projected/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-kube-api-access-f6cqp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.995482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:09 crc kubenswrapper[4749]: I0219 18:57:09.995626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.097447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.097535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.097591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6cqp\" (UniqueName: \"kubernetes.io/projected/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-kube-api-access-f6cqp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.101870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.102636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.113694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6cqp\" (UniqueName: \"kubernetes.io/projected/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-kube-api-access-f6cqp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5cmm6\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.215425 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.751687 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"]
Feb 19 18:57:10 crc kubenswrapper[4749]: W0219 18:57:10.758957 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab3e2ecf_4d0f_4482_87fc_1098e7b8818a.slice/crio-7f5176ff1e34d5c11a46d9be35e00f7f63ef5433aca13bfa0c46ef2ac880214c WatchSource:0}: Error finding container 7f5176ff1e34d5c11a46d9be35e00f7f63ef5433aca13bfa0c46ef2ac880214c: Status 404 returned error can't find the container with id 7f5176ff1e34d5c11a46d9be35e00f7f63ef5433aca13bfa0c46ef2ac880214c
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.804426 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerID="89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43" exitCode=0
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.804539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5mfd" event={"ID":"6c321926-0b01-4bd4-83ff-2fa26e0d312a","Type":"ContainerDied","Data":"89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43"}
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.804616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5mfd" event={"ID":"6c321926-0b01-4bd4-83ff-2fa26e0d312a","Type":"ContainerStarted","Data":"ee63d61099bd5fc3054c14b4ebb79042f18be91b384eca5e45e36a04cbbc43e9"}
Feb 19 18:57:10 crc kubenswrapper[4749]: I0219 18:57:10.806115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6" event={"ID":"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a","Type":"ContainerStarted","Data":"7f5176ff1e34d5c11a46d9be35e00f7f63ef5433aca13bfa0c46ef2ac880214c"}
Feb 19 18:57:11 crc kubenswrapper[4749]: I0219 18:57:11.817215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5mfd" event={"ID":"6c321926-0b01-4bd4-83ff-2fa26e0d312a","Type":"ContainerStarted","Data":"605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e"}
Feb 19 18:57:11 crc kubenswrapper[4749]: I0219 18:57:11.819170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6" event={"ID":"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a","Type":"ContainerStarted","Data":"ae44aa0dd64c710f8f71aeccdb048b68914a884c6ced580753e0a983bd62d017"}
Feb 19 18:57:11 crc kubenswrapper[4749]: I0219 18:57:11.862365 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6" podStartSLOduration=2.413388866 podStartE2EDuration="2.862341055s" podCreationTimestamp="2026-02-19 18:57:09 +0000 UTC" firstStartedPulling="2026-02-19 18:57:10.770976713 +0000 UTC m=+1404.732196667" lastFinishedPulling="2026-02-19 18:57:11.219928902 +0000 UTC m=+1405.181148856" observedRunningTime="2026-02-19 18:57:11.853194193 +0000 UTC m=+1405.814414147" watchObservedRunningTime="2026-02-19 18:57:11.862341055 +0000 UTC m=+1405.823561039"
Feb 19 18:57:12 crc kubenswrapper[4749]: I0219 18:57:12.792998 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-456hf"
Feb 19 18:57:12 crc kubenswrapper[4749]: I0219 18:57:12.793445 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-456hf"
Feb 19 18:57:12 crc kubenswrapper[4749]: I0219 18:57:12.851982 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-456hf"
Feb 19 18:57:12 crc kubenswrapper[4749]: I0219 18:57:12.903421 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-456hf"
Feb 19 18:57:14 crc kubenswrapper[4749]: I0219 18:57:14.850310 4749 generic.go:334] "Generic (PLEG): container finished" podID="ab3e2ecf-4d0f-4482-87fc-1098e7b8818a" containerID="ae44aa0dd64c710f8f71aeccdb048b68914a884c6ced580753e0a983bd62d017" exitCode=0
Feb 19 18:57:14 crc kubenswrapper[4749]: I0219 18:57:14.850421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6" event={"ID":"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a","Type":"ContainerDied","Data":"ae44aa0dd64c710f8f71aeccdb048b68914a884c6ced580753e0a983bd62d017"}
Feb 19 18:57:15 crc kubenswrapper[4749]: I0219 18:57:15.028897 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-456hf"]
Feb 19 18:57:15 crc kubenswrapper[4749]: I0219 18:57:15.029196 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-456hf" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="registry-server" containerID="cri-o://820799f17d84dba7650e39ca2f142396b2ac474735f4ffae9c691a9833c0e892" gracePeriod=2
Feb 19 18:57:15 crc kubenswrapper[4749]: I0219 18:57:15.859872 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerID="605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e" exitCode=0
Feb 19 18:57:15 crc kubenswrapper[4749]: I0219 18:57:15.859898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5mfd" event={"ID":"6c321926-0b01-4bd4-83ff-2fa26e0d312a","Type":"ContainerDied","Data":"605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e"}
Feb 19 18:57:15 crc kubenswrapper[4749]: I0219 18:57:15.864085 4749 generic.go:334] "Generic (PLEG): container finished" podID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerID="820799f17d84dba7650e39ca2f142396b2ac474735f4ffae9c691a9833c0e892" exitCode=0
Feb 19 18:57:15 crc kubenswrapper[4749]: I0219 18:57:15.864300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-456hf" event={"ID":"31148a4b-37af-4f5b-b0a9-84e707ba1ad4","Type":"ContainerDied","Data":"820799f17d84dba7650e39ca2f142396b2ac474735f4ffae9c691a9833c0e892"}
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.220212 4749 scope.go:117] "RemoveContainer" containerID="2526f8dc6e001f84c0dc07081bcc7822b10251354f7b568a4fde083fdf119970"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.311004 4749 scope.go:117] "RemoveContainer" containerID="505eaac2bb687c42a34911e46d43ca987fdfa63a1a58808c66af3062702a0237"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.693757 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.799477 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-456hf"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.841203 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6cqp\" (UniqueName: \"kubernetes.io/projected/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-kube-api-access-f6cqp\") pod \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") "
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.841328 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-inventory\") pod \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") "
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.841358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-ssh-key-openstack-edpm-ipam\") pod \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\" (UID: \"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a\") "
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.847904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-kube-api-access-f6cqp" (OuterVolumeSpecName: "kube-api-access-f6cqp") pod "ab3e2ecf-4d0f-4482-87fc-1098e7b8818a" (UID: "ab3e2ecf-4d0f-4482-87fc-1098e7b8818a"). InnerVolumeSpecName "kube-api-access-f6cqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.872339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-inventory" (OuterVolumeSpecName: "inventory") pod "ab3e2ecf-4d0f-4482-87fc-1098e7b8818a" (UID: "ab3e2ecf-4d0f-4482-87fc-1098e7b8818a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.875330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6" event={"ID":"ab3e2ecf-4d0f-4482-87fc-1098e7b8818a","Type":"ContainerDied","Data":"7f5176ff1e34d5c11a46d9be35e00f7f63ef5433aca13bfa0c46ef2ac880214c"}
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.875362 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f5176ff1e34d5c11a46d9be35e00f7f63ef5433aca13bfa0c46ef2ac880214c"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.875399 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5cmm6"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.877745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-456hf" event={"ID":"31148a4b-37af-4f5b-b0a9-84e707ba1ad4","Type":"ContainerDied","Data":"f9aa0b023c594e6c4267cbfd8079cd4858f1d76874d41378c0b03cba7ea61865"}
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.877783 4749 scope.go:117] "RemoveContainer" containerID="820799f17d84dba7650e39ca2f142396b2ac474735f4ffae9c691a9833c0e892"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.877915 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-456hf"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.885316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5mfd" event={"ID":"6c321926-0b01-4bd4-83ff-2fa26e0d312a","Type":"ContainerStarted","Data":"4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57"}
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.887305 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ab3e2ecf-4d0f-4482-87fc-1098e7b8818a" (UID: "ab3e2ecf-4d0f-4482-87fc-1098e7b8818a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.906049 4749 scope.go:117] "RemoveContainer" containerID="3e2a227a03f0d4f3bb7a5eef724a9153e2bfdc4ca3e970468d526c6fa6fc2845"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.919291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s5mfd" podStartSLOduration=2.216466579 podStartE2EDuration="7.919268833s" podCreationTimestamp="2026-02-19 18:57:09 +0000 UTC" firstStartedPulling="2026-02-19 18:57:10.806567585 +0000 UTC m=+1404.767787539" lastFinishedPulling="2026-02-19 18:57:16.509369839 +0000 UTC m=+1410.470589793" observedRunningTime="2026-02-19 18:57:16.90638483 +0000 UTC m=+1410.867604794" watchObservedRunningTime="2026-02-19 18:57:16.919268833 +0000 UTC m=+1410.880488797"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.934674 4749 scope.go:117] "RemoveContainer" containerID="2a3cfc7b21613006e39473e724c892a91e8a067630800f25afea1d3d990dcd36"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.945865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-catalog-content\") pod \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") "
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.946047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xlrl\" (UniqueName: \"kubernetes.io/projected/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-kube-api-access-8xlrl\") pod \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") "
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.946149 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-utilities\") pod \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\" (UID: \"31148a4b-37af-4f5b-b0a9-84e707ba1ad4\") "
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.946735 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.946760 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.946777 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6cqp\" (UniqueName: \"kubernetes.io/projected/ab3e2ecf-4d0f-4482-87fc-1098e7b8818a-kube-api-access-f6cqp\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.946954 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-utilities" (OuterVolumeSpecName: "utilities") pod "31148a4b-37af-4f5b-b0a9-84e707ba1ad4" (UID: "31148a4b-37af-4f5b-b0a9-84e707ba1ad4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.949862 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-kube-api-access-8xlrl" (OuterVolumeSpecName: "kube-api-access-8xlrl") pod "31148a4b-37af-4f5b-b0a9-84e707ba1ad4" (UID: "31148a4b-37af-4f5b-b0a9-84e707ba1ad4"). InnerVolumeSpecName "kube-api-access-8xlrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.984086 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"]
Feb 19 18:57:16 crc kubenswrapper[4749]: E0219 18:57:16.984462 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="extract-content"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.984478 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="extract-content"
Feb 19 18:57:16 crc kubenswrapper[4749]: E0219 18:57:16.984490 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="registry-server"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.984496 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="registry-server"
Feb 19 18:57:16 crc kubenswrapper[4749]: E0219 18:57:16.984518 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="extract-utilities"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.984524 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="extract-utilities"
Feb 19 18:57:16 crc kubenswrapper[4749]: E0219 18:57:16.984536 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3e2ecf-4d0f-4482-87fc-1098e7b8818a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.984542 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3e2ecf-4d0f-4482-87fc-1098e7b8818a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.984731 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3e2ecf-4d0f-4482-87fc-1098e7b8818a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.984746 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" containerName="registry-server"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.985385 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.986163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31148a4b-37af-4f5b-b0a9-84e707ba1ad4" (UID: "31148a4b-37af-4f5b-b0a9-84e707ba1ad4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:57:16 crc kubenswrapper[4749]: I0219 18:57:16.993149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"]
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.048892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.048934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchx2\" (UniqueName: \"kubernetes.io/projected/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-kube-api-access-hchx2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.049118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.049405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.049645 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.049662 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.049673 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xlrl\" (UniqueName: \"kubernetes.io/projected/31148a4b-37af-4f5b-b0a9-84e707ba1ad4-kube-api-access-8xlrl\") on node \"crc\" DevicePath \"\""
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.151506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.151645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.151711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.151738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hchx2\" (UniqueName: \"kubernetes.io/projected/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-kube-api-access-hchx2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.156649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.157207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.157216 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.167626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchx2\" (UniqueName: \"kubernetes.io/projected/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-kube-api-access-hchx2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.225143 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-456hf"]
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.238995 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-456hf"]
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.312267 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.852582 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"]
Feb 19 18:57:17 crc kubenswrapper[4749]: W0219 18:57:17.861357 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22b6e4d8_5ab3_4a92_b8a4_e38b68e59744.slice/crio-11ac7a4fd0d26c04d76833aeb95fcadffeccdc9ac7f68b17bc101d9478934376 WatchSource:0}: Error finding container 11ac7a4fd0d26c04d76833aeb95fcadffeccdc9ac7f68b17bc101d9478934376: Status 404 returned error can't find the container with id 11ac7a4fd0d26c04d76833aeb95fcadffeccdc9ac7f68b17bc101d9478934376
Feb 19 18:57:17 crc kubenswrapper[4749]: I0219 18:57:17.897252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq" event={"ID":"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744","Type":"ContainerStarted","Data":"11ac7a4fd0d26c04d76833aeb95fcadffeccdc9ac7f68b17bc101d9478934376"}
Feb 19 18:57:18 crc kubenswrapper[4749]: I0219 18:57:18.693182 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31148a4b-37af-4f5b-b0a9-84e707ba1ad4" path="/var/lib/kubelet/pods/31148a4b-37af-4f5b-b0a9-84e707ba1ad4/volumes"
Feb 19 18:57:18 crc kubenswrapper[4749]: I0219 18:57:18.907972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq" event={"ID":"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744","Type":"ContainerStarted","Data":"6b26264946943d575a305e3fdb8738852f36efccaff137e4ca271c794c53005d"}
Feb 19 18:57:18 crc kubenswrapper[4749]: I0219 18:57:18.935207 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq" podStartSLOduration=2.440575851 podStartE2EDuration="2.935176136s" podCreationTimestamp="2026-02-19 18:57:16 +0000 UTC" firstStartedPulling="2026-02-19 18:57:17.86384496 +0000 UTC m=+1411.825064914" lastFinishedPulling="2026-02-19 18:57:18.358445245 +0000 UTC m=+1412.319665199" observedRunningTime="2026-02-19 18:57:18.924703133 +0000 UTC m=+1412.885923097" watchObservedRunningTime="2026-02-19 18:57:18.935176136 +0000 UTC m=+1412.896396100"
Feb 19 18:57:19 crc kubenswrapper[4749]: I0219 18:57:19.404580 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s5mfd"
Feb 19 18:57:19 crc kubenswrapper[4749]: I0219 18:57:19.404624 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s5mfd"
Feb 19 18:57:19 crc kubenswrapper[4749]: I0219 18:57:19.453303 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s5mfd"
Feb 19 18:57:24 crc kubenswrapper[4749]: I0219 18:57:24.725578 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 18:57:24 crc kubenswrapper[4749]: I0219 18:57:24.726301 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 18:57:29 crc kubenswrapper[4749]: I0219 18:57:29.456344 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s5mfd"
Feb 19 18:57:29 crc kubenswrapper[4749]: I0219 18:57:29.533238 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5mfd"]
Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.007176 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s5mfd" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="registry-server" containerID="cri-o://4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57" gracePeriod=2
Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.490234 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5mfd"
Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.615239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-utilities\") pod \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") "
Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.615367 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvjb5\" (UniqueName: \"kubernetes.io/projected/6c321926-0b01-4bd4-83ff-2fa26e0d312a-kube-api-access-jvjb5\") pod \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") "
Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.615549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-catalog-content\") pod \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\" (UID: \"6c321926-0b01-4bd4-83ff-2fa26e0d312a\") "
Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.616213 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-utilities" (OuterVolumeSpecName: "utilities") pod "6c321926-0b01-4bd4-83ff-2fa26e0d312a" (UID: "6c321926-0b01-4bd4-83ff-2fa26e0d312a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.624429 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c321926-0b01-4bd4-83ff-2fa26e0d312a-kube-api-access-jvjb5" (OuterVolumeSpecName: "kube-api-access-jvjb5") pod "6c321926-0b01-4bd4-83ff-2fa26e0d312a" (UID: "6c321926-0b01-4bd4-83ff-2fa26e0d312a"). InnerVolumeSpecName "kube-api-access-jvjb5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.667404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c321926-0b01-4bd4-83ff-2fa26e0d312a" (UID: "6c321926-0b01-4bd4-83ff-2fa26e0d312a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.718936 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.718973 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c321926-0b01-4bd4-83ff-2fa26e0d312a-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 18:57:30 crc kubenswrapper[4749]: I0219 18:57:30.719008 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvjb5\" (UniqueName: \"kubernetes.io/projected/6c321926-0b01-4bd4-83ff-2fa26e0d312a-kube-api-access-jvjb5\") on node \"crc\" DevicePath \"\"" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.018324 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerID="4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57" exitCode=0 Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.018376 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s5mfd" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.018408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5mfd" event={"ID":"6c321926-0b01-4bd4-83ff-2fa26e0d312a","Type":"ContainerDied","Data":"4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57"} Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.018496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5mfd" event={"ID":"6c321926-0b01-4bd4-83ff-2fa26e0d312a","Type":"ContainerDied","Data":"ee63d61099bd5fc3054c14b4ebb79042f18be91b384eca5e45e36a04cbbc43e9"} Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.018521 4749 scope.go:117] "RemoveContainer" containerID="4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.043069 4749 scope.go:117] "RemoveContainer" containerID="605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.049104 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5mfd"] Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.058584 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s5mfd"] Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.066136 4749 scope.go:117] "RemoveContainer" containerID="89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.110000 4749 scope.go:117] "RemoveContainer" containerID="4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57" Feb 19 18:57:31 crc kubenswrapper[4749]: E0219 18:57:31.110472 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57\": container with ID starting with 4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57 not found: ID does not exist" containerID="4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.110508 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57"} err="failed to get container status \"4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57\": rpc error: code = NotFound desc = could not find container \"4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57\": container with ID starting with 4bbe32e623e9e6df005e3c58c22f05d910c469bde3c77ac20178e38d86358f57 not found: ID does not exist" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.110531 4749 scope.go:117] "RemoveContainer" containerID="605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e" Feb 19 18:57:31 crc kubenswrapper[4749]: E0219 18:57:31.110983 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e\": container with ID starting with 605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e not found: ID does not exist" containerID="605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.111078 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e"} err="failed to get container status \"605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e\": rpc error: code = NotFound desc = could not find container \"605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e\": container with ID 
starting with 605430e27306251f6be9554256774a716c3961634e062190755f85f9b956f62e not found: ID does not exist" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.111125 4749 scope.go:117] "RemoveContainer" containerID="89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43" Feb 19 18:57:31 crc kubenswrapper[4749]: E0219 18:57:31.111425 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43\": container with ID starting with 89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43 not found: ID does not exist" containerID="89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43" Feb 19 18:57:31 crc kubenswrapper[4749]: I0219 18:57:31.111458 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43"} err="failed to get container status \"89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43\": rpc error: code = NotFound desc = could not find container \"89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43\": container with ID starting with 89784bd20b1a711fb571638a33598d6068861f2209632cf6a1789401475b9d43 not found: ID does not exist" Feb 19 18:57:32 crc kubenswrapper[4749]: I0219 18:57:32.690682 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" path="/var/lib/kubelet/pods/6c321926-0b01-4bd4-83ff-2fa26e0d312a/volumes" Feb 19 18:57:54 crc kubenswrapper[4749]: I0219 18:57:54.725166 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 18:57:54 crc kubenswrapper[4749]: I0219 
18:57:54.725609 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 18:57:54 crc kubenswrapper[4749]: I0219 18:57:54.725653 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 18:57:54 crc kubenswrapper[4749]: I0219 18:57:54.726439 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d4364be274ebee2999aeb5c900bcffa311ebe5dc83a06265c201d7a01da149a"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 18:57:54 crc kubenswrapper[4749]: I0219 18:57:54.726486 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://6d4364be274ebee2999aeb5c900bcffa311ebe5dc83a06265c201d7a01da149a" gracePeriod=600 Feb 19 18:57:55 crc kubenswrapper[4749]: I0219 18:57:55.244555 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="6d4364be274ebee2999aeb5c900bcffa311ebe5dc83a06265c201d7a01da149a" exitCode=0 Feb 19 18:57:55 crc kubenswrapper[4749]: I0219 18:57:55.244607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"6d4364be274ebee2999aeb5c900bcffa311ebe5dc83a06265c201d7a01da149a"} Feb 19 18:57:55 crc 
kubenswrapper[4749]: I0219 18:57:55.245274 4749 scope.go:117] "RemoveContainer" containerID="92113cb5e4b06748bc8c8134a5b7e7475e1c6d6f0cf9e918b487ab08df45bb1f" Feb 19 18:57:56 crc kubenswrapper[4749]: I0219 18:57:56.262398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"} Feb 19 18:58:16 crc kubenswrapper[4749]: I0219 18:58:16.498112 4749 scope.go:117] "RemoveContainer" containerID="ea3111911f70a12e19c82baf183a7d9b4294320ae0c58115da896f67e0ca42fc" Feb 19 18:58:16 crc kubenswrapper[4749]: I0219 18:58:16.524518 4749 scope.go:117] "RemoveContainer" containerID="7379a11bb2a3623b7b06c52e6724ffd514a62505ff46b0039924f9d92a454eaa" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.149717 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc"] Feb 19 19:00:00 crc kubenswrapper[4749]: E0219 19:00:00.150621 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="extract-content" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.150634 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="extract-content" Feb 19 19:00:00 crc kubenswrapper[4749]: E0219 19:00:00.150642 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="registry-server" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.150647 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="registry-server" Feb 19 19:00:00 crc kubenswrapper[4749]: E0219 19:00:00.150670 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="extract-utilities" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.150677 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="extract-utilities" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.150851 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c321926-0b01-4bd4-83ff-2fa26e0d312a" containerName="registry-server" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.151689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.153172 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.153457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.167712 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc"] Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.248093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5286134-7c72-4d2e-b922-292886feb2c5-config-volume\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.248340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c264d\" (UniqueName: 
\"kubernetes.io/projected/a5286134-7c72-4d2e-b922-292886feb2c5-kube-api-access-c264d\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.248483 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5286134-7c72-4d2e-b922-292886feb2c5-secret-volume\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.350363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5286134-7c72-4d2e-b922-292886feb2c5-config-volume\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.350438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c264d\" (UniqueName: \"kubernetes.io/projected/a5286134-7c72-4d2e-b922-292886feb2c5-kube-api-access-c264d\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.350485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5286134-7c72-4d2e-b922-292886feb2c5-secret-volume\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc 
kubenswrapper[4749]: I0219 19:00:00.351415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5286134-7c72-4d2e-b922-292886feb2c5-config-volume\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.357777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5286134-7c72-4d2e-b922-292886feb2c5-secret-volume\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.367396 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c264d\" (UniqueName: \"kubernetes.io/projected/a5286134-7c72-4d2e-b922-292886feb2c5-kube-api-access-c264d\") pod \"collect-profiles-29525460-dzdjc\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.471810 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:00 crc kubenswrapper[4749]: I0219 19:00:00.924862 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc"] Feb 19 19:00:01 crc kubenswrapper[4749]: I0219 19:00:01.456014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" event={"ID":"a5286134-7c72-4d2e-b922-292886feb2c5","Type":"ContainerStarted","Data":"0e6420a8bfd2e1f6598da4a56c6888903004ac5af42e017395320fda4929fba4"} Feb 19 19:00:01 crc kubenswrapper[4749]: I0219 19:00:01.456348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" event={"ID":"a5286134-7c72-4d2e-b922-292886feb2c5","Type":"ContainerStarted","Data":"c96d277c1178c2a23370da9e03414881a9776000b032b120ea1a43550931e171"} Feb 19 19:00:01 crc kubenswrapper[4749]: I0219 19:00:01.472962 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" podStartSLOduration=1.47293871 podStartE2EDuration="1.47293871s" podCreationTimestamp="2026-02-19 19:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:00:01.468744068 +0000 UTC m=+1575.429964022" watchObservedRunningTime="2026-02-19 19:00:01.47293871 +0000 UTC m=+1575.434158664" Feb 19 19:00:02 crc kubenswrapper[4749]: I0219 19:00:02.466366 4749 generic.go:334] "Generic (PLEG): container finished" podID="a5286134-7c72-4d2e-b922-292886feb2c5" containerID="0e6420a8bfd2e1f6598da4a56c6888903004ac5af42e017395320fda4929fba4" exitCode=0 Feb 19 19:00:02 crc kubenswrapper[4749]: I0219 19:00:02.466417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" event={"ID":"a5286134-7c72-4d2e-b922-292886feb2c5","Type":"ContainerDied","Data":"0e6420a8bfd2e1f6598da4a56c6888903004ac5af42e017395320fda4929fba4"} Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.771294 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.923128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5286134-7c72-4d2e-b922-292886feb2c5-secret-volume\") pod \"a5286134-7c72-4d2e-b922-292886feb2c5\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.923503 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5286134-7c72-4d2e-b922-292886feb2c5-config-volume\") pod \"a5286134-7c72-4d2e-b922-292886feb2c5\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.923762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c264d\" (UniqueName: \"kubernetes.io/projected/a5286134-7c72-4d2e-b922-292886feb2c5-kube-api-access-c264d\") pod \"a5286134-7c72-4d2e-b922-292886feb2c5\" (UID: \"a5286134-7c72-4d2e-b922-292886feb2c5\") " Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.924040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5286134-7c72-4d2e-b922-292886feb2c5-config-volume" (OuterVolumeSpecName: "config-volume") pod "a5286134-7c72-4d2e-b922-292886feb2c5" (UID: "a5286134-7c72-4d2e-b922-292886feb2c5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.924477 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5286134-7c72-4d2e-b922-292886feb2c5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.939375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5286134-7c72-4d2e-b922-292886feb2c5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a5286134-7c72-4d2e-b922-292886feb2c5" (UID: "a5286134-7c72-4d2e-b922-292886feb2c5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:00:03 crc kubenswrapper[4749]: I0219 19:00:03.939926 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5286134-7c72-4d2e-b922-292886feb2c5-kube-api-access-c264d" (OuterVolumeSpecName: "kube-api-access-c264d") pod "a5286134-7c72-4d2e-b922-292886feb2c5" (UID: "a5286134-7c72-4d2e-b922-292886feb2c5"). InnerVolumeSpecName "kube-api-access-c264d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:00:04 crc kubenswrapper[4749]: I0219 19:00:04.026110 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5286134-7c72-4d2e-b922-292886feb2c5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:00:04 crc kubenswrapper[4749]: I0219 19:00:04.026149 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c264d\" (UniqueName: \"kubernetes.io/projected/a5286134-7c72-4d2e-b922-292886feb2c5-kube-api-access-c264d\") on node \"crc\" DevicePath \"\"" Feb 19 19:00:04 crc kubenswrapper[4749]: I0219 19:00:04.488645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" event={"ID":"a5286134-7c72-4d2e-b922-292886feb2c5","Type":"ContainerDied","Data":"c96d277c1178c2a23370da9e03414881a9776000b032b120ea1a43550931e171"} Feb 19 19:00:04 crc kubenswrapper[4749]: I0219 19:00:04.488690 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c96d277c1178c2a23370da9e03414881a9776000b032b120ea1a43550931e171" Feb 19 19:00:04 crc kubenswrapper[4749]: I0219 19:00:04.488736 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc" Feb 19 19:00:24 crc kubenswrapper[4749]: I0219 19:00:24.725725 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:00:24 crc kubenswrapper[4749]: I0219 19:00:24.726315 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:00:31 crc kubenswrapper[4749]: I0219 19:00:31.727236 4749 generic.go:334] "Generic (PLEG): container finished" podID="22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" containerID="6b26264946943d575a305e3fdb8738852f36efccaff137e4ca271c794c53005d" exitCode=0 Feb 19 19:00:31 crc kubenswrapper[4749]: I0219 19:00:31.727311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq" event={"ID":"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744","Type":"ContainerDied","Data":"6b26264946943d575a305e3fdb8738852f36efccaff137e4ca271c794c53005d"} Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.165904 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq" Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.317655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-ssh-key-openstack-edpm-ipam\") pod \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.317727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-bootstrap-combined-ca-bundle\") pod \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.317886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchx2\" (UniqueName: \"kubernetes.io/projected/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-kube-api-access-hchx2\") pod \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.318008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-inventory\") pod \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\" (UID: \"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744\") " Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.324910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" (UID: "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.326174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-kube-api-access-hchx2" (OuterVolumeSpecName: "kube-api-access-hchx2") pod "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" (UID: "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744"). InnerVolumeSpecName "kube-api-access-hchx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.346403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" (UID: "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.356303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-inventory" (OuterVolumeSpecName: "inventory") pod "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" (UID: "22b6e4d8-5ab3-4a92-b8a4-e38b68e59744"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.420347 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.420384 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.420394 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hchx2\" (UniqueName: \"kubernetes.io/projected/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-kube-api-access-hchx2\") on node \"crc\" DevicePath \"\""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.420404 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22b6e4d8-5ab3-4a92-b8a4-e38b68e59744-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.748703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq" event={"ID":"22b6e4d8-5ab3-4a92-b8a4-e38b68e59744","Type":"ContainerDied","Data":"11ac7a4fd0d26c04d76833aeb95fcadffeccdc9ac7f68b17bc101d9478934376"}
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.748765 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ac7a4fd0d26c04d76833aeb95fcadffeccdc9ac7f68b17bc101d9478934376"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.748779 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.834507 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"]
Feb 19 19:00:33 crc kubenswrapper[4749]: E0219 19:00:33.835155 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.835188 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:00:33 crc kubenswrapper[4749]: E0219 19:00:33.835205 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5286134-7c72-4d2e-b922-292886feb2c5" containerName="collect-profiles"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.835211 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5286134-7c72-4d2e-b922-292886feb2c5" containerName="collect-profiles"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.835777 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5286134-7c72-4d2e-b922-292886feb2c5" containerName="collect-profiles"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.835796 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b6e4d8-5ab3-4a92-b8a4-e38b68e59744" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.836588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.842509 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.842827 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.843371 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.843378 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:00:33 crc kubenswrapper[4749]: I0219 19:00:33.852823 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"]
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.030796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.030948 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.031058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j5r7\" (UniqueName: \"kubernetes.io/projected/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-kube-api-access-9j5r7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.132673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.132760 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.132818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j5r7\" (UniqueName: \"kubernetes.io/projected/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-kube-api-access-9j5r7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.137559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.142589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.148689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j5r7\" (UniqueName: \"kubernetes.io/projected/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-kube-api-access-9j5r7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.152688 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.708783 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k"]
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.714372 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 19:00:34 crc kubenswrapper[4749]: I0219 19:00:34.765571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k" event={"ID":"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb","Type":"ContainerStarted","Data":"ea28c401648226879ba8205e5e21a9a9220a593275547e8e31dbc641c5e11658"}
Feb 19 19:00:35 crc kubenswrapper[4749]: I0219 19:00:35.776637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k" event={"ID":"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb","Type":"ContainerStarted","Data":"63908d7c84180e86b3ef9c32f27eabd42ccf17d8f0eadd9c2831e88900cf9bf9"}
Feb 19 19:00:35 crc kubenswrapper[4749]: I0219 19:00:35.799342 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k" podStartSLOduration=2.229019016 podStartE2EDuration="2.799323302s" podCreationTimestamp="2026-02-19 19:00:33 +0000 UTC" firstStartedPulling="2026-02-19 19:00:34.714171423 +0000 UTC m=+1608.675391377" lastFinishedPulling="2026-02-19 19:00:35.284475709 +0000 UTC m=+1609.245695663" observedRunningTime="2026-02-19 19:00:35.792710871 +0000 UTC m=+1609.753930825" watchObservedRunningTime="2026-02-19 19:00:35.799323302 +0000 UTC m=+1609.760543256"
Feb 19 19:00:54 crc kubenswrapper[4749]: I0219 19:00:54.725843 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:00:54 crc kubenswrapper[4749]: I0219 19:00:54.726611 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.037424 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f544-account-create-update-h22hj"]
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.047654 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8mb6x"]
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.055997 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f544-account-create-update-h22hj"]
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.066518 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8mb6x"]
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.144785 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525461-vkh7z"]
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.145980 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.167211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525461-vkh7z"]
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.215106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-fernet-keys\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.215170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-config-data\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.215440 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-combined-ca-bundle\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.215589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xfb\" (UniqueName: \"kubernetes.io/projected/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-kube-api-access-l2xfb\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.317677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-fernet-keys\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.317742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-config-data\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.317802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-combined-ca-bundle\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.317859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xfb\" (UniqueName: \"kubernetes.io/projected/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-kube-api-access-l2xfb\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.323549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-combined-ca-bundle\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.323846 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-fernet-keys\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.324396 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-config-data\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.338213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xfb\" (UniqueName: \"kubernetes.io/projected/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-kube-api-access-l2xfb\") pod \"keystone-cron-29525461-vkh7z\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") " pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.465761 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.702573 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a056c11-d814-41ff-a7b7-a1aaa36ed053" path="/var/lib/kubelet/pods/0a056c11-d814-41ff-a7b7-a1aaa36ed053/volumes"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.714431 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abcce5b-61f2-44f5-82e3-9d476a81ccd8" path="/var/lib/kubelet/pods/1abcce5b-61f2-44f5-82e3-9d476a81ccd8/volumes"
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.907823 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525461-vkh7z"]
Feb 19 19:01:00 crc kubenswrapper[4749]: I0219 19:01:00.986098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525461-vkh7z" event={"ID":"bdfb9341-a712-489d-ba8e-01ce41b5d1cb","Type":"ContainerStarted","Data":"124d412636098bf2cc479b699917e20c810297481cad991f6472742219f4596b"}
Feb 19 19:01:01 crc kubenswrapper[4749]: I0219 19:01:01.996286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525461-vkh7z" event={"ID":"bdfb9341-a712-489d-ba8e-01ce41b5d1cb","Type":"ContainerStarted","Data":"5842ed825fe2bad300618f08600fb4946b8e3e64b6248c1e1433fa1eb6501158"}
Feb 19 19:01:02 crc kubenswrapper[4749]: I0219 19:01:02.018083 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525461-vkh7z" podStartSLOduration=2.017996667 podStartE2EDuration="2.017996667s" podCreationTimestamp="2026-02-19 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:01:02.00982156 +0000 UTC m=+1635.971041524" watchObservedRunningTime="2026-02-19 19:01:02.017996667 +0000 UTC m=+1635.979216621"
Feb 19 19:01:04 crc kubenswrapper[4749]: I0219 19:01:04.015273 4749 generic.go:334] "Generic (PLEG): container finished" podID="bdfb9341-a712-489d-ba8e-01ce41b5d1cb" containerID="5842ed825fe2bad300618f08600fb4946b8e3e64b6248c1e1433fa1eb6501158" exitCode=0
Feb 19 19:01:04 crc kubenswrapper[4749]: I0219 19:01:04.015360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525461-vkh7z" event={"ID":"bdfb9341-a712-489d-ba8e-01ce41b5d1cb","Type":"ContainerDied","Data":"5842ed825fe2bad300618f08600fb4946b8e3e64b6248c1e1433fa1eb6501158"}
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.337641 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.420671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-config-data\") pod \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") "
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.420812 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2xfb\" (UniqueName: \"kubernetes.io/projected/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-kube-api-access-l2xfb\") pod \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") "
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.420932 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-combined-ca-bundle\") pod \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") "
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.420991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-fernet-keys\") pod \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\" (UID: \"bdfb9341-a712-489d-ba8e-01ce41b5d1cb\") "
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.426905 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-kube-api-access-l2xfb" (OuterVolumeSpecName: "kube-api-access-l2xfb") pod "bdfb9341-a712-489d-ba8e-01ce41b5d1cb" (UID: "bdfb9341-a712-489d-ba8e-01ce41b5d1cb"). InnerVolumeSpecName "kube-api-access-l2xfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.427758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bdfb9341-a712-489d-ba8e-01ce41b5d1cb" (UID: "bdfb9341-a712-489d-ba8e-01ce41b5d1cb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.458964 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdfb9341-a712-489d-ba8e-01ce41b5d1cb" (UID: "bdfb9341-a712-489d-ba8e-01ce41b5d1cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.479164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-config-data" (OuterVolumeSpecName: "config-data") pod "bdfb9341-a712-489d-ba8e-01ce41b5d1cb" (UID: "bdfb9341-a712-489d-ba8e-01ce41b5d1cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.523343 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.523376 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.523387 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2xfb\" (UniqueName: \"kubernetes.io/projected/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-kube-api-access-l2xfb\") on node \"crc\" DevicePath \"\""
Feb 19 19:01:05 crc kubenswrapper[4749]: I0219 19:01:05.523398 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb9341-a712-489d-ba8e-01ce41b5d1cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.033818 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-794a-account-create-update-296dd"]
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.036446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525461-vkh7z" event={"ID":"bdfb9341-a712-489d-ba8e-01ce41b5d1cb","Type":"ContainerDied","Data":"124d412636098bf2cc479b699917e20c810297481cad991f6472742219f4596b"}
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.036483 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="124d412636098bf2cc479b699917e20c810297481cad991f6472742219f4596b"
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.036516 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525461-vkh7z"
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.045715 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7mz2s"]
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.066936 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-794a-account-create-update-296dd"]
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.078912 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7mz2s"]
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.698459 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a5f43c-0d61-48a3-abb9-86a7bb12af24" path="/var/lib/kubelet/pods/33a5f43c-0d61-48a3-abb9-86a7bb12af24/volumes"
Feb 19 19:01:06 crc kubenswrapper[4749]: I0219 19:01:06.699386 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1e4d4d-f564-46cd-86ae-2b2902b5f678" path="/var/lib/kubelet/pods/dc1e4d4d-f564-46cd-86ae-2b2902b5f678/volumes"
Feb 19 19:01:07 crc kubenswrapper[4749]: I0219 19:01:07.028316 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-18b6-account-create-update-6cfqr"]
Feb 19 19:01:07 crc kubenswrapper[4749]: I0219 19:01:07.039361 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-tq984"]
Feb 19 19:01:07 crc kubenswrapper[4749]: I0219 19:01:07.049153 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-tq984"]
Feb 19 19:01:07 crc kubenswrapper[4749]: I0219 19:01:07.057805 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-18b6-account-create-update-6cfqr"]
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.047301 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-53ab-account-create-update-ggjn8"]
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.059000 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-ftrcj"]
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.067380 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-53ab-account-create-update-ggjn8"]
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.080477 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-ftrcj"]
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.689317 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3" path="/var/lib/kubelet/pods/28f9406a-e2cf-4bbe-ac55-ac7eb3e9d5b3/volumes"
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.690125 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307d3d46-e4e9-4ed1-abc3-88afd38d5497" path="/var/lib/kubelet/pods/307d3d46-e4e9-4ed1-abc3-88afd38d5497/volumes"
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.690644 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06e2a80-6319-460a-895c-686ceee7e8df" path="/var/lib/kubelet/pods/a06e2a80-6319-460a-895c-686ceee7e8df/volumes"
Feb 19 19:01:08 crc kubenswrapper[4749]: I0219 19:01:08.692478 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac53868-ce9a-4993-b620-a5cc50286b92" path="/var/lib/kubelet/pods/aac53868-ce9a-4993-b620-a5cc50286b92/volumes"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.676901 4749 scope.go:117] "RemoveContainer" containerID="b79398cce56b4e0681150010c89d7cba4890cb8511db5f22cfb45d82c0a01d17"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.714142 4749 scope.go:117] "RemoveContainer" containerID="feed97219043ded708db9c3cc1303e18a31a3ec504ab91a4f8870a38a8005f11"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.758057 4749 scope.go:117] "RemoveContainer" containerID="99ea3770e65720cdb19e7356929d565c47ef4c6b7954073cb2d57c0fee98f1db"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.818442 4749 scope.go:117] "RemoveContainer" containerID="6df48fc95856a290bf4052b93a35b03d065fad77c8ea9e1645e2ee1c181fcbe5"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.858865 4749 scope.go:117] "RemoveContainer" containerID="b711117dbecc09677e41ebcf5ecc036ae53d0f39e0d400503a109aad9d3ff5b6"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.903532 4749 scope.go:117] "RemoveContainer" containerID="e0e6efb47dd1e6d9f27f0466b50e790b54f4447f59a7c7bdbbafb80b8c1d99b2"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.943776 4749 scope.go:117] "RemoveContainer" containerID="2d3d2170553317949e650860ca2acf7aabe778227f7f709ef09631ece2474c0d"
Feb 19 19:01:16 crc kubenswrapper[4749]: I0219 19:01:16.973599 4749 scope.go:117] "RemoveContainer" containerID="f937af1f2d8b9f78426e90c648d467a608bbc3596d4ff03b7baa83a22085c0d8"
Feb 19 19:01:19 crc kubenswrapper[4749]: I0219 19:01:19.057772 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hspjc"]
Feb 19 19:01:19 crc kubenswrapper[4749]: I0219 19:01:19.069125 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hspjc"]
Feb 19 19:01:20 crc kubenswrapper[4749]: I0219 19:01:20.693210 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910f857c-2f81-4747-8401-0b59bab921a2" path="/var/lib/kubelet/pods/910f857c-2f81-4747-8401-0b59bab921a2/volumes"
Feb 19 19:01:24 crc kubenswrapper[4749]: I0219 19:01:24.725342 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:01:24 crc kubenswrapper[4749]: I0219 19:01:24.725999 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:01:24 crc kubenswrapper[4749]: I0219 19:01:24.726071 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 19:01:24 crc kubenswrapper[4749]: I0219 19:01:24.727048 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:01:24 crc kubenswrapper[4749]: I0219 19:01:24.727123 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" gracePeriod=600
Feb 19 19:01:24 crc kubenswrapper[4749]: E0219 19:01:24.910811 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:01:25 crc kubenswrapper[4749]: I0219 19:01:25.214349 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" exitCode=0
Feb 19 19:01:25 crc kubenswrapper[4749]: I0219 19:01:25.214394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"}
Feb 19 19:01:25 crc kubenswrapper[4749]: I0219 19:01:25.214428 4749 scope.go:117] "RemoveContainer" containerID="6d4364be274ebee2999aeb5c900bcffa311ebe5dc83a06265c201d7a01da149a"
Feb 19 19:01:25 crc kubenswrapper[4749]: I0219 19:01:25.215061 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:01:25 crc kubenswrapper[4749]: E0219 19:01:25.215459 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.039667 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rhcc5"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.058455 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2xmjl"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.067598 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nggkc"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.076956 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1d78-account-create-update-lrl5l"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.085184 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-14db-account-create-update-xdwgq"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.093650 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nggkc"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.101255 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2xmjl"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.108251 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-14db-account-create-update-xdwgq"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.115415 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rhcc5"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.122510 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1d78-account-create-update-lrl5l"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.129757 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cbfb-account-create-update-gzp2p"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.137184 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cbfb-account-create-update-gzp2p"]
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.688898 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f880df-7af3-4bab-91f1-5085b70a86d0" path="/var/lib/kubelet/pods/15f880df-7af3-4bab-91f1-5085b70a86d0/volumes"
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.689487 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1767b8ff-e819-4268-982c-57cd067b1cd5" path="/var/lib/kubelet/pods/1767b8ff-e819-4268-982c-57cd067b1cd5/volumes"
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.690043 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa888c5-51f3-47a3-ae88-65694f44677a" path="/var/lib/kubelet/pods/4fa888c5-51f3-47a3-ae88-65694f44677a/volumes"
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.690642 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d0d068-45a8-49a7-9cbe-a3ff70cc056f" path="/var/lib/kubelet/pods/55d0d068-45a8-49a7-9cbe-a3ff70cc056f/volumes"
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.691915 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34e48f9-e454-4f11-b78e-965516098e91" path="/var/lib/kubelet/pods/f34e48f9-e454-4f11-b78e-965516098e91/volumes"
Feb 19 19:01:30 crc kubenswrapper[4749]: I0219 19:01:30.692600 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc2d05b-8f07-4386-9d4c-604e07f0f265" path="/var/lib/kubelet/pods/fbc2d05b-8f07-4386-9d4c-604e07f0f265/volumes"
Feb 19 19:01:34 crc kubenswrapper[4749]: I0219 19:01:34.028733 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9wjqf"]
Feb 19 19:01:34 crc kubenswrapper[4749]: I0219 19:01:34.045603 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9wjqf"]
Feb 19 19:01:34 crc kubenswrapper[4749]: I0219 19:01:34.692404 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb0bea8-d0bd-4a01-a752-7c9697971db8" path="/var/lib/kubelet/pods/4fb0bea8-d0bd-4a01-a752-7c9697971db8/volumes"
Feb 19 19:01:36 crc kubenswrapper[4749]: I0219 19:01:36.685042 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:01:36 crc kubenswrapper[4749]: E0219 19:01:36.686240 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\""
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:01:41 crc kubenswrapper[4749]: I0219 19:01:41.031770 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-tkp7q"] Feb 19 19:01:41 crc kubenswrapper[4749]: I0219 19:01:41.040568 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-tkp7q"] Feb 19 19:01:42 crc kubenswrapper[4749]: I0219 19:01:42.037094 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-trrfz"] Feb 19 19:01:42 crc kubenswrapper[4749]: I0219 19:01:42.048839 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-trrfz"] Feb 19 19:01:42 crc kubenswrapper[4749]: I0219 19:01:42.688658 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04522760-872e-4efc-a852-726292ac24f4" path="/var/lib/kubelet/pods/04522760-872e-4efc-a852-726292ac24f4/volumes" Feb 19 19:01:42 crc kubenswrapper[4749]: I0219 19:01:42.689413 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284da664-b432-48cf-8f30-fa7cc57bd5b3" path="/var/lib/kubelet/pods/284da664-b432-48cf-8f30-fa7cc57bd5b3/volumes" Feb 19 19:01:47 crc kubenswrapper[4749]: I0219 19:01:47.679684 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:01:47 crc kubenswrapper[4749]: E0219 19:01:47.680550 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:02:00 crc kubenswrapper[4749]: I0219 19:02:00.679576 4749 scope.go:117] 
"RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:02:00 crc kubenswrapper[4749]: E0219 19:02:00.680483 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:02:14 crc kubenswrapper[4749]: I0219 19:02:14.678876 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:02:14 crc kubenswrapper[4749]: E0219 19:02:14.679730 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.147680 4749 scope.go:117] "RemoveContainer" containerID="8c17a93c00a141afefd774bb8ff010f31d0100825f40064435589d1ad56bcb39" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.189063 4749 scope.go:117] "RemoveContainer" containerID="c6616dd3022501a5ad9ae3b9f6dd0bcb3db2fb08710e3f51164bce28d0d78b30" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.215567 4749 scope.go:117] "RemoveContainer" containerID="4f7270a050bf0b55c766d387494cbb82874997b5f40c884d357181fdc9361b53" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.260496 4749 scope.go:117] "RemoveContainer" containerID="cd30e62108093956c6066b5e2d48fc375104fa0c280c57d78b26afdd27a388b2" Feb 19 19:02:17 crc 
kubenswrapper[4749]: I0219 19:02:17.317920 4749 scope.go:117] "RemoveContainer" containerID="352a35ca5c0a0988d7f8bf9cfca788b7f9d858c3dcd1018d9be50641b86b57ef" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.363551 4749 scope.go:117] "RemoveContainer" containerID="ad0e7d8564db6dacbe89319f85830b5e2c7bc2b01cbebdcb6e3ab4a6d370319f" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.402201 4749 scope.go:117] "RemoveContainer" containerID="f5ce2cb7b7b99196d49aff9f25ab067642afbe62910cb83f462d2f20329bc557" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.441899 4749 scope.go:117] "RemoveContainer" containerID="3554e0aa16a11cbb9c6f94a23b8e93ea5fc408c9dbeaae7a66d9a5d403192f0e" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.468586 4749 scope.go:117] "RemoveContainer" containerID="41eb58b1338db1e882273d3a36b1da142d86f374ea98f3b78a0578f261bd70df" Feb 19 19:02:17 crc kubenswrapper[4749]: I0219 19:02:17.497761 4749 scope.go:117] "RemoveContainer" containerID="51054023a2dc977d46e8925ec806bae10007d1e004f36abc9d03bf6988217c7a" Feb 19 19:02:23 crc kubenswrapper[4749]: I0219 19:02:23.763991 4749 generic.go:334] "Generic (PLEG): container finished" podID="fe29f0cb-bc56-4d7b-983c-52667a6c4ceb" containerID="63908d7c84180e86b3ef9c32f27eabd42ccf17d8f0eadd9c2831e88900cf9bf9" exitCode=0 Feb 19 19:02:23 crc kubenswrapper[4749]: I0219 19:02:23.764136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k" event={"ID":"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb","Type":"ContainerDied","Data":"63908d7c84180e86b3ef9c32f27eabd42ccf17d8f0eadd9c2831e88900cf9bf9"} Feb 19 19:02:24 crc kubenswrapper[4749]: I0219 19:02:24.040909 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wsg2v"] Feb 19 19:02:24 crc kubenswrapper[4749]: I0219 19:02:24.051848 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-qr8r7"] Feb 19 19:02:24 crc 
kubenswrapper[4749]: I0219 19:02:24.062933 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-qr8r7"] Feb 19 19:02:24 crc kubenswrapper[4749]: I0219 19:02:24.073486 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wsg2v"] Feb 19 19:02:24 crc kubenswrapper[4749]: I0219 19:02:24.690669 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180a38b2-06f7-49bb-8641-4d82c2e14183" path="/var/lib/kubelet/pods/180a38b2-06f7-49bb-8641-4d82c2e14183/volumes" Feb 19 19:02:24 crc kubenswrapper[4749]: I0219 19:02:24.691578 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6e94fd-e114-41b2-8634-ca805b5e260f" path="/var/lib/kubelet/pods/4e6e94fd-e114-41b2-8634-ca805b5e260f/volumes" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.040702 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jvrxf"] Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.050941 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jvrxf"] Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.214217 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.323159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-inventory\") pod \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.323288 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j5r7\" (UniqueName: \"kubernetes.io/projected/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-kube-api-access-9j5r7\") pod \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.323419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-ssh-key-openstack-edpm-ipam\") pod \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\" (UID: \"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb\") " Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.330311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-kube-api-access-9j5r7" (OuterVolumeSpecName: "kube-api-access-9j5r7") pod "fe29f0cb-bc56-4d7b-983c-52667a6c4ceb" (UID: "fe29f0cb-bc56-4d7b-983c-52667a6c4ceb"). InnerVolumeSpecName "kube-api-access-9j5r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.354740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-inventory" (OuterVolumeSpecName: "inventory") pod "fe29f0cb-bc56-4d7b-983c-52667a6c4ceb" (UID: "fe29f0cb-bc56-4d7b-983c-52667a6c4ceb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.366273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe29f0cb-bc56-4d7b-983c-52667a6c4ceb" (UID: "fe29f0cb-bc56-4d7b-983c-52667a6c4ceb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.425798 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.425837 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j5r7\" (UniqueName: \"kubernetes.io/projected/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-kube-api-access-9j5r7\") on node \"crc\" DevicePath \"\"" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.425871 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe29f0cb-bc56-4d7b-983c-52667a6c4ceb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.781561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k" event={"ID":"fe29f0cb-bc56-4d7b-983c-52667a6c4ceb","Type":"ContainerDied","Data":"ea28c401648226879ba8205e5e21a9a9220a593275547e8e31dbc641c5e11658"} Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.781912 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea28c401648226879ba8205e5e21a9a9220a593275547e8e31dbc641c5e11658" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 
19:02:25.781610 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.888260 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z"] Feb 19 19:02:25 crc kubenswrapper[4749]: E0219 19:02:25.888707 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfb9341-a712-489d-ba8e-01ce41b5d1cb" containerName="keystone-cron" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.888724 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfb9341-a712-489d-ba8e-01ce41b5d1cb" containerName="keystone-cron" Feb 19 19:02:25 crc kubenswrapper[4749]: E0219 19:02:25.888755 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe29f0cb-bc56-4d7b-983c-52667a6c4ceb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.888762 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe29f0cb-bc56-4d7b-983c-52667a6c4ceb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.888961 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe29f0cb-bc56-4d7b-983c-52667a6c4ceb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.888978 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfb9341-a712-489d-ba8e-01ce41b5d1cb" containerName="keystone-cron" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.889641 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.893837 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.894089 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.894425 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.894482 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.912382 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z"] Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.938198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:25 crc kubenswrapper[4749]: I0219 19:02:25.938367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:25 crc 
kubenswrapper[4749]: I0219 19:02:25.938407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/3c39d9c2-1081-4cf9-96c7-746c51a42207-kube-api-access-mnw6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.039611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.039688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/3c39d9c2-1081-4cf9-96c7-746c51a42207-kube-api-access-mnw6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.039769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.043612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.043692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.054856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/3c39d9c2-1081-4cf9-96c7-746c51a42207-kube-api-access-mnw6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.212771 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.689789 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9efa7d7f-28c0-4bd7-ae4f-f988968544c0" path="/var/lib/kubelet/pods/9efa7d7f-28c0-4bd7-ae4f-f988968544c0/volumes" Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.750150 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z"] Feb 19 19:02:26 crc kubenswrapper[4749]: I0219 19:02:26.795239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" event={"ID":"3c39d9c2-1081-4cf9-96c7-746c51a42207","Type":"ContainerStarted","Data":"e3cabefbe4ba5860d9b19d2cfc2e309b968cde3f605b4bedba2fdbe704ee79fa"} Feb 19 19:02:27 crc kubenswrapper[4749]: I0219 19:02:27.806986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" event={"ID":"3c39d9c2-1081-4cf9-96c7-746c51a42207","Type":"ContainerStarted","Data":"58a27e400955fe9151eed6ea66de977b26a6abb754d9ca58a31371d3f9858a19"} Feb 19 19:02:27 crc kubenswrapper[4749]: I0219 19:02:27.829132 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" podStartSLOduration=2.31375358 podStartE2EDuration="2.829113615s" podCreationTimestamp="2026-02-19 19:02:25 +0000 UTC" firstStartedPulling="2026-02-19 19:02:26.748574287 +0000 UTC m=+1720.709794231" lastFinishedPulling="2026-02-19 19:02:27.263934312 +0000 UTC m=+1721.225154266" observedRunningTime="2026-02-19 19:02:27.82312157 +0000 UTC m=+1721.784341544" watchObservedRunningTime="2026-02-19 19:02:27.829113615 +0000 UTC m=+1721.790333569" Feb 19 19:02:28 crc kubenswrapper[4749]: I0219 19:02:28.678943 4749 scope.go:117] "RemoveContainer" 
containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:02:28 crc kubenswrapper[4749]: E0219 19:02:28.679294 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:02:40 crc kubenswrapper[4749]: I0219 19:02:40.679661 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:02:40 crc kubenswrapper[4749]: E0219 19:02:40.681278 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:02:43 crc kubenswrapper[4749]: I0219 19:02:43.051704 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f8dgh"] Feb 19 19:02:43 crc kubenswrapper[4749]: I0219 19:02:43.062220 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f8dgh"] Feb 19 19:02:44 crc kubenswrapper[4749]: I0219 19:02:44.704224 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d" path="/var/lib/kubelet/pods/d3a8e773-dd45-4aa6-96ba-d3e6cb89f42d/volumes" Feb 19 19:02:48 crc kubenswrapper[4749]: I0219 19:02:48.027236 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cmbgb"] Feb 19 19:02:48 crc 
kubenswrapper[4749]: I0219 19:02:48.035692 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cmbgb"] Feb 19 19:02:48 crc kubenswrapper[4749]: I0219 19:02:48.689534 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5701fcc2-ae2a-4017-8991-3470421ff234" path="/var/lib/kubelet/pods/5701fcc2-ae2a-4017-8991-3470421ff234/volumes" Feb 19 19:02:51 crc kubenswrapper[4749]: I0219 19:02:51.678665 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:02:51 crc kubenswrapper[4749]: E0219 19:02:51.679277 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:03:03 crc kubenswrapper[4749]: I0219 19:03:03.679770 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:03:03 crc kubenswrapper[4749]: E0219 19:03:03.680717 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:03:17 crc kubenswrapper[4749]: I0219 19:03:17.678422 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:03:17 crc kubenswrapper[4749]: E0219 19:03:17.679278 4749 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:03:17 crc kubenswrapper[4749]: I0219 19:03:17.685322 4749 scope.go:117] "RemoveContainer" containerID="2afb535e3c37a3548a563720815df50da4e49fd0ce861c86afb0dccb93d27954" Feb 19 19:03:17 crc kubenswrapper[4749]: I0219 19:03:17.725109 4749 scope.go:117] "RemoveContainer" containerID="59151c6bf16b4a152bc622f54e0507052951f8cc27cdb87f310817d937e10c53" Feb 19 19:03:17 crc kubenswrapper[4749]: I0219 19:03:17.779926 4749 scope.go:117] "RemoveContainer" containerID="fd16a22fd1e877553f10b8e81bace93a1789b59cfab3e6a9937e8cdb3cbf4092" Feb 19 19:03:17 crc kubenswrapper[4749]: I0219 19:03:17.818008 4749 scope.go:117] "RemoveContainer" containerID="aeb577c04e53e47cd1c67a35b32c7ae083c5ca53af319660498c257ead2a633e" Feb 19 19:03:17 crc kubenswrapper[4749]: I0219 19:03:17.862537 4749 scope.go:117] "RemoveContainer" containerID="5efeef4609cfd895356d5ac735bb91074bc2f5dc82f3c2ece8267ae058e92e5e" Feb 19 19:03:31 crc kubenswrapper[4749]: I0219 19:03:31.679139 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:03:31 crc kubenswrapper[4749]: E0219 19:03:31.680058 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:03:34 crc 
kubenswrapper[4749]: I0219 19:03:34.444744 4749 generic.go:334] "Generic (PLEG): container finished" podID="3c39d9c2-1081-4cf9-96c7-746c51a42207" containerID="58a27e400955fe9151eed6ea66de977b26a6abb754d9ca58a31371d3f9858a19" exitCode=0 Feb 19 19:03:34 crc kubenswrapper[4749]: I0219 19:03:34.444860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" event={"ID":"3c39d9c2-1081-4cf9-96c7-746c51a42207","Type":"ContainerDied","Data":"58a27e400955fe9151eed6ea66de977b26a6abb754d9ca58a31371d3f9858a19"} Feb 19 19:03:35 crc kubenswrapper[4749]: I0219 19:03:35.836312 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:03:35 crc kubenswrapper[4749]: I0219 19:03:35.927836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/3c39d9c2-1081-4cf9-96c7-746c51a42207-kube-api-access-mnw6l\") pod \"3c39d9c2-1081-4cf9-96c7-746c51a42207\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " Feb 19 19:03:35 crc kubenswrapper[4749]: I0219 19:03:35.927990 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-ssh-key-openstack-edpm-ipam\") pod \"3c39d9c2-1081-4cf9-96c7-746c51a42207\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " Feb 19 19:03:35 crc kubenswrapper[4749]: I0219 19:03:35.928118 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-inventory\") pod \"3c39d9c2-1081-4cf9-96c7-746c51a42207\" (UID: \"3c39d9c2-1081-4cf9-96c7-746c51a42207\") " Feb 19 19:03:35 crc kubenswrapper[4749]: I0219 19:03:35.933266 4749 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c39d9c2-1081-4cf9-96c7-746c51a42207-kube-api-access-mnw6l" (OuterVolumeSpecName: "kube-api-access-mnw6l") pod "3c39d9c2-1081-4cf9-96c7-746c51a42207" (UID: "3c39d9c2-1081-4cf9-96c7-746c51a42207"). InnerVolumeSpecName "kube-api-access-mnw6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:03:35 crc kubenswrapper[4749]: I0219 19:03:35.958775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3c39d9c2-1081-4cf9-96c7-746c51a42207" (UID: "3c39d9c2-1081-4cf9-96c7-746c51a42207"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:03:35 crc kubenswrapper[4749]: I0219 19:03:35.959778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-inventory" (OuterVolumeSpecName: "inventory") pod "3c39d9c2-1081-4cf9-96c7-746c51a42207" (UID: "3c39d9c2-1081-4cf9-96c7-746c51a42207"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.030449 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/3c39d9c2-1081-4cf9-96c7-746c51a42207-kube-api-access-mnw6l\") on node \"crc\" DevicePath \"\"" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.030489 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.030504 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c39d9c2-1081-4cf9-96c7-746c51a42207-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.463006 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" event={"ID":"3c39d9c2-1081-4cf9-96c7-746c51a42207","Type":"ContainerDied","Data":"e3cabefbe4ba5860d9b19d2cfc2e309b968cde3f605b4bedba2fdbe704ee79fa"} Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.463363 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3cabefbe4ba5860d9b19d2cfc2e309b968cde3f605b4bedba2fdbe704ee79fa" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.463118 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.569970 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf"] Feb 19 19:03:36 crc kubenswrapper[4749]: E0219 19:03:36.570572 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c39d9c2-1081-4cf9-96c7-746c51a42207" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.570596 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c39d9c2-1081-4cf9-96c7-746c51a42207" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.570828 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c39d9c2-1081-4cf9-96c7-746c51a42207" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.571713 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.581893 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.582188 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.582428 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.582627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.614516 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf"] Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.743265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.743658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2clc\" (UniqueName: \"kubernetes.io/projected/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-kube-api-access-g2clc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 
19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.743888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.846154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2clc\" (UniqueName: \"kubernetes.io/projected/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-kube-api-access-g2clc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.846285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.846359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.850803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.854321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.862430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2clc\" (UniqueName: \"kubernetes.io/projected/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-kube-api-access-g2clc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:36 crc kubenswrapper[4749]: I0219 19:03:36.897782 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:37 crc kubenswrapper[4749]: I0219 19:03:37.418295 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf"] Feb 19 19:03:37 crc kubenswrapper[4749]: I0219 19:03:37.473173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" event={"ID":"91c1231f-dc2c-4c68-ba4f-e6d99913bd60","Type":"ContainerStarted","Data":"739865964e10aca690c6d9a074a959ec10351ff02f6df1b370d070e63591601e"} Feb 19 19:03:38 crc kubenswrapper[4749]: I0219 19:03:38.482044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" event={"ID":"91c1231f-dc2c-4c68-ba4f-e6d99913bd60","Type":"ContainerStarted","Data":"7e6232cda6d0bccd5db04d4f9625dca407ccc3a4c2225dc302ce49d7606fa795"} Feb 19 19:03:38 crc kubenswrapper[4749]: I0219 19:03:38.499068 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" podStartSLOduration=2.04267021 podStartE2EDuration="2.499048782s" podCreationTimestamp="2026-02-19 19:03:36 +0000 UTC" firstStartedPulling="2026-02-19 19:03:37.428677292 +0000 UTC m=+1791.389897236" lastFinishedPulling="2026-02-19 19:03:37.885055854 +0000 UTC m=+1791.846275808" observedRunningTime="2026-02-19 19:03:38.496938665 +0000 UTC m=+1792.458158619" watchObservedRunningTime="2026-02-19 19:03:38.499048782 +0000 UTC m=+1792.460268746" Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.042539 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f0bc-account-create-update-xv578"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.052503 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4085-account-create-update-t9pdt"] 
Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.063102 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dcm8j"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.073881 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-53d3-account-create-update-7n9p9"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.081928 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9r5lq"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.088986 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xzg8l"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.096522 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f0bc-account-create-update-xv578"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.103308 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dcm8j"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.110489 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4085-account-create-update-t9pdt"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.117640 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9r5lq"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.125105 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xzg8l"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.133893 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-53d3-account-create-update-7n9p9"] Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.517537 4749 generic.go:334] "Generic (PLEG): container finished" podID="91c1231f-dc2c-4c68-ba4f-e6d99913bd60" containerID="7e6232cda6d0bccd5db04d4f9625dca407ccc3a4c2225dc302ce49d7606fa795" exitCode=0 Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.517622 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" event={"ID":"91c1231f-dc2c-4c68-ba4f-e6d99913bd60","Type":"ContainerDied","Data":"7e6232cda6d0bccd5db04d4f9625dca407ccc3a4c2225dc302ce49d7606fa795"} Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.689424 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a9c221-8937-4524-98ed-508ca1522a9d" path="/var/lib/kubelet/pods/01a9c221-8937-4524-98ed-508ca1522a9d/volumes" Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.689955 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb7b549-3520-49f2-9ee8-089bab48181d" path="/var/lib/kubelet/pods/0cb7b549-3520-49f2-9ee8-089bab48181d/volumes" Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.690502 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b169885b-4f49-41e1-83e7-d43237329082" path="/var/lib/kubelet/pods/b169885b-4f49-41e1-83e7-d43237329082/volumes" Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.691110 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f" path="/var/lib/kubelet/pods/ba3b0274-1b68-4fdd-b2d9-f8b2f04ed60f/volumes" Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.692070 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fde9adc3-a038-4de5-949c-d366ebe287d6" path="/var/lib/kubelet/pods/fde9adc3-a038-4de5-949c-d366ebe287d6/volumes" Feb 19 19:03:42 crc kubenswrapper[4749]: I0219 19:03:42.692560 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2b2973-1897-4202-be29-9ca7805d443a" path="/var/lib/kubelet/pods/fe2b2973-1897-4202-be29-9ca7805d443a/volumes" Feb 19 19:03:43 crc kubenswrapper[4749]: I0219 19:03:43.952041 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.095018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2clc\" (UniqueName: \"kubernetes.io/projected/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-kube-api-access-g2clc\") pod \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.095121 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-inventory\") pod \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.095423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-ssh-key-openstack-edpm-ipam\") pod \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\" (UID: \"91c1231f-dc2c-4c68-ba4f-e6d99913bd60\") " Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.100393 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-kube-api-access-g2clc" (OuterVolumeSpecName: "kube-api-access-g2clc") pod "91c1231f-dc2c-4c68-ba4f-e6d99913bd60" (UID: "91c1231f-dc2c-4c68-ba4f-e6d99913bd60"). InnerVolumeSpecName "kube-api-access-g2clc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.123398 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "91c1231f-dc2c-4c68-ba4f-e6d99913bd60" (UID: "91c1231f-dc2c-4c68-ba4f-e6d99913bd60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.125350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-inventory" (OuterVolumeSpecName: "inventory") pod "91c1231f-dc2c-4c68-ba4f-e6d99913bd60" (UID: "91c1231f-dc2c-4c68-ba4f-e6d99913bd60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.197590 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.197636 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2clc\" (UniqueName: \"kubernetes.io/projected/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-kube-api-access-g2clc\") on node \"crc\" DevicePath \"\"" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.197645 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91c1231f-dc2c-4c68-ba4f-e6d99913bd60-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.540177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" 
event={"ID":"91c1231f-dc2c-4c68-ba4f-e6d99913bd60","Type":"ContainerDied","Data":"739865964e10aca690c6d9a074a959ec10351ff02f6df1b370d070e63591601e"} Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.540635 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="739865964e10aca690c6d9a074a959ec10351ff02f6df1b370d070e63591601e" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.540247 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.617928 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr"] Feb 19 19:03:44 crc kubenswrapper[4749]: E0219 19:03:44.618365 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c1231f-dc2c-4c68-ba4f-e6d99913bd60" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.618383 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c1231f-dc2c-4c68-ba4f-e6d99913bd60" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.618631 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c1231f-dc2c-4c68-ba4f-e6d99913bd60" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.619308 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.621370 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.622797 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.623599 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.626294 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.629593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr"] Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.807623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7ljc\" (UniqueName: \"kubernetes.io/projected/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-kube-api-access-b7ljc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.807681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 
19:03:44.807728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.909887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7ljc\" (UniqueName: \"kubernetes.io/projected/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-kube-api-access-b7ljc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.910413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.911180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.915645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.915689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.928604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7ljc\" (UniqueName: \"kubernetes.io/projected/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-kube-api-access-b7ljc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkxkr\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:44 crc kubenswrapper[4749]: I0219 19:03:44.934239 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" Feb 19 19:03:45 crc kubenswrapper[4749]: I0219 19:03:45.438351 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr"] Feb 19 19:03:45 crc kubenswrapper[4749]: I0219 19:03:45.548956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" event={"ID":"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d","Type":"ContainerStarted","Data":"e855046eb7832f3f35d4fd37e4efd972e57d153a44b1f28a13e9a409912f0116"} Feb 19 19:03:46 crc kubenswrapper[4749]: I0219 19:03:46.558748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" event={"ID":"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d","Type":"ContainerStarted","Data":"a6c294cf283f17e0ad0cc81e3876b055690003e09a68489f291648e88d157573"} Feb 19 19:03:46 crc kubenswrapper[4749]: I0219 19:03:46.685182 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:03:46 crc kubenswrapper[4749]: E0219 19:03:46.685444 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:03:58 crc kubenswrapper[4749]: I0219 19:03:58.678843 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:03:58 crc kubenswrapper[4749]: E0219 19:03:58.680182 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:04:11 crc kubenswrapper[4749]: I0219 19:04:11.679788 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:04:11 crc kubenswrapper[4749]: E0219 19:04:11.680731 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:04:12 crc kubenswrapper[4749]: I0219 19:04:12.040612 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" podStartSLOduration=27.600056019 podStartE2EDuration="28.040592158s" podCreationTimestamp="2026-02-19 19:03:44 +0000 UTC" firstStartedPulling="2026-02-19 19:03:45.44972915 +0000 UTC m=+1799.410949104" lastFinishedPulling="2026-02-19 19:03:45.890265289 +0000 UTC m=+1799.851485243" observedRunningTime="2026-02-19 19:03:46.582227663 +0000 UTC m=+1800.543447617" watchObservedRunningTime="2026-02-19 19:04:12.040592158 +0000 UTC m=+1826.001812122" Feb 19 19:04:12 crc kubenswrapper[4749]: I0219 19:04:12.047677 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fhz4p"] Feb 19 19:04:12 crc kubenswrapper[4749]: I0219 19:04:12.059143 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fhz4p"] Feb 19 19:04:12 crc 
kubenswrapper[4749]: I0219 19:04:12.691260 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba20baa-aaa9-44cd-8076-2a93bf65ff1e" path="/var/lib/kubelet/pods/bba20baa-aaa9-44cd-8076-2a93bf65ff1e/volumes"
Feb 19 19:04:18 crc kubenswrapper[4749]: I0219 19:04:18.002655 4749 scope.go:117] "RemoveContainer" containerID="cfa6524ccba16d2d8f1f346ed55ba6bd6cc21a6a4e3abea529a647ce2e2307a3"
Feb 19 19:04:18 crc kubenswrapper[4749]: I0219 19:04:18.031183 4749 scope.go:117] "RemoveContainer" containerID="63d4c8b4d8f9752dde1907d862f485e8035f1f3cf89a217ef2e652ea42cfb699"
Feb 19 19:04:18 crc kubenswrapper[4749]: I0219 19:04:18.095329 4749 scope.go:117] "RemoveContainer" containerID="cf43de36c50b60b14fe4b2bcb8e098295a7c8556e9921688cc793bf38436280e"
Feb 19 19:04:18 crc kubenswrapper[4749]: I0219 19:04:18.155934 4749 scope.go:117] "RemoveContainer" containerID="44a837171e721b0ccd68a06da05986cc1fc3220ece3994219136528f6ac967cd"
Feb 19 19:04:18 crc kubenswrapper[4749]: I0219 19:04:18.185702 4749 scope.go:117] "RemoveContainer" containerID="d1ae116371615005842cecd26fb4feec90c7ad69b97d39d522d3bc68dcb6dc25"
Feb 19 19:04:18 crc kubenswrapper[4749]: I0219 19:04:18.228714 4749 scope.go:117] "RemoveContainer" containerID="65cd47193439036fe28115d1479c32c2eb3410dab98fa8a6c954cb4a3fada840"
Feb 19 19:04:18 crc kubenswrapper[4749]: I0219 19:04:18.265062 4749 scope.go:117] "RemoveContainer" containerID="2a241fe1cf6961b1ad10732e0e8d3fd8a6fc085e00fef88f87c5f9f0009ee6e9"
Feb 19 19:04:21 crc kubenswrapper[4749]: I0219 19:04:21.897814 4749 generic.go:334] "Generic (PLEG): container finished" podID="59c9b6e2-493f-4c1a-ae9b-47dca8a2658d" containerID="a6c294cf283f17e0ad0cc81e3876b055690003e09a68489f291648e88d157573" exitCode=0
Feb 19 19:04:21 crc kubenswrapper[4749]: I0219 19:04:21.897876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" event={"ID":"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d","Type":"ContainerDied","Data":"a6c294cf283f17e0ad0cc81e3876b055690003e09a68489f291648e88d157573"}
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.294752 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr"
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.404388 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-inventory\") pod \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") "
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.404447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-ssh-key-openstack-edpm-ipam\") pod \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") "
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.404653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7ljc\" (UniqueName: \"kubernetes.io/projected/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-kube-api-access-b7ljc\") pod \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\" (UID: \"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d\") "
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.409694 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-kube-api-access-b7ljc" (OuterVolumeSpecName: "kube-api-access-b7ljc") pod "59c9b6e2-493f-4c1a-ae9b-47dca8a2658d" (UID: "59c9b6e2-493f-4c1a-ae9b-47dca8a2658d"). InnerVolumeSpecName "kube-api-access-b7ljc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.431178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59c9b6e2-493f-4c1a-ae9b-47dca8a2658d" (UID: "59c9b6e2-493f-4c1a-ae9b-47dca8a2658d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.434830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-inventory" (OuterVolumeSpecName: "inventory") pod "59c9b6e2-493f-4c1a-ae9b-47dca8a2658d" (UID: "59c9b6e2-493f-4c1a-ae9b-47dca8a2658d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.506939 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7ljc\" (UniqueName: \"kubernetes.io/projected/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-kube-api-access-b7ljc\") on node \"crc\" DevicePath \"\""
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.506973 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.506982 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c9b6e2-493f-4c1a-ae9b-47dca8a2658d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.920595 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr" event={"ID":"59c9b6e2-493f-4c1a-ae9b-47dca8a2658d","Type":"ContainerDied","Data":"e855046eb7832f3f35d4fd37e4efd972e57d153a44b1f28a13e9a409912f0116"}
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.920649 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e855046eb7832f3f35d4fd37e4efd972e57d153a44b1f28a13e9a409912f0116"
Feb 19 19:04:23 crc kubenswrapper[4749]: I0219 19:04:23.920666 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkxkr"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.022172 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"]
Feb 19 19:04:24 crc kubenswrapper[4749]: E0219 19:04:24.022675 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c9b6e2-493f-4c1a-ae9b-47dca8a2658d" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.022699 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c9b6e2-493f-4c1a-ae9b-47dca8a2658d" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.022943 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c9b6e2-493f-4c1a-ae9b-47dca8a2658d" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.023816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.027460 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.027767 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.027985 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.037849 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.039599 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"]
Feb 19 19:04:24 crc kubenswrapper[4749]: E0219 19:04:24.166792 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59c9b6e2_493f_4c1a_ae9b_47dca8a2658d.slice\": RecentStats: unable to find data in memory cache]"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.222345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.222726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.223157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gqkz\" (UniqueName: \"kubernetes.io/projected/780425c2-7f97-4d00-a992-bf0ef5be3876-kube-api-access-8gqkz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.324443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gqkz\" (UniqueName: \"kubernetes.io/projected/780425c2-7f97-4d00-a992-bf0ef5be3876-kube-api-access-8gqkz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.324495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.324561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.329413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.331434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.343125 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gqkz\" (UniqueName: \"kubernetes.io/projected/780425c2-7f97-4d00-a992-bf0ef5be3876-kube-api-access-8gqkz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qw75z\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.354824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.870718 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"]
Feb 19 19:04:24 crc kubenswrapper[4749]: I0219 19:04:24.929679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z" event={"ID":"780425c2-7f97-4d00-a992-bf0ef5be3876","Type":"ContainerStarted","Data":"56df161d0ed31ae457d9e0753f5547ce79647cf3ab8c6a75de0ab7ab94c5e9ca"}
Feb 19 19:04:25 crc kubenswrapper[4749]: I0219 19:04:25.939835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z" event={"ID":"780425c2-7f97-4d00-a992-bf0ef5be3876","Type":"ContainerStarted","Data":"8191f5637379b668e7a8a8ca17018663855701cb9638e62eebc122375f2af08c"}
Feb 19 19:04:25 crc kubenswrapper[4749]: I0219 19:04:25.969313 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z" podStartSLOduration=2.552570683 podStartE2EDuration="2.969292182s" podCreationTimestamp="2026-02-19 19:04:23 +0000 UTC" firstStartedPulling="2026-02-19 19:04:24.870579373 +0000 UTC m=+1838.831799327" lastFinishedPulling="2026-02-19 19:04:25.287300852 +0000 UTC m=+1839.248520826" observedRunningTime="2026-02-19 19:04:25.962661737 +0000 UTC m=+1839.923881701" watchObservedRunningTime="2026-02-19 19:04:25.969292182 +0000 UTC m=+1839.930512136"
Feb 19 19:04:26 crc kubenswrapper[4749]: I0219 19:04:26.684798 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:04:26 crc kubenswrapper[4749]: E0219 19:04:26.685089 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:04:35 crc kubenswrapper[4749]: I0219 19:04:35.056382 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnwdf"]
Feb 19 19:04:35 crc kubenswrapper[4749]: I0219 19:04:35.067602 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnwdf"]
Feb 19 19:04:36 crc kubenswrapper[4749]: I0219 19:04:36.690317 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7" path="/var/lib/kubelet/pods/ff1fe8c4-ed92-4058-80e5-bc16c5f93ae7/volumes"
Feb 19 19:04:37 crc kubenswrapper[4749]: I0219 19:04:37.678650 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:04:37 crc kubenswrapper[4749]: E0219 19:04:37.679090 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:04:39 crc kubenswrapper[4749]: I0219 19:04:39.033980 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-znh79"]
Feb 19 19:04:39 crc kubenswrapper[4749]: I0219 19:04:39.045875 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-znh79"]
Feb 19 19:04:40 crc kubenswrapper[4749]: I0219 19:04:40.690419 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f858df-0b0b-4e6a-ada4-c3063dbae4c5" path="/var/lib/kubelet/pods/46f858df-0b0b-4e6a-ada4-c3063dbae4c5/volumes"
Feb 19 19:04:52 crc kubenswrapper[4749]: I0219 19:04:52.679582 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:04:52 crc kubenswrapper[4749]: E0219 19:04:52.682242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:05:07 crc kubenswrapper[4749]: I0219 19:05:07.679336 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:05:07 crc kubenswrapper[4749]: E0219 19:05:07.680105 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:05:15 crc kubenswrapper[4749]: I0219 19:05:15.655412 4749 generic.go:334] "Generic (PLEG): container finished" podID="780425c2-7f97-4d00-a992-bf0ef5be3876" containerID="8191f5637379b668e7a8a8ca17018663855701cb9638e62eebc122375f2af08c" exitCode=0
Feb 19 19:05:15 crc kubenswrapper[4749]: I0219 19:05:15.655488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z" event={"ID":"780425c2-7f97-4d00-a992-bf0ef5be3876","Type":"ContainerDied","Data":"8191f5637379b668e7a8a8ca17018663855701cb9638e62eebc122375f2af08c"}
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.038159 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.092984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-ssh-key-openstack-edpm-ipam\") pod \"780425c2-7f97-4d00-a992-bf0ef5be3876\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") "
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.093111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-inventory\") pod \"780425c2-7f97-4d00-a992-bf0ef5be3876\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") "
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.093236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gqkz\" (UniqueName: \"kubernetes.io/projected/780425c2-7f97-4d00-a992-bf0ef5be3876-kube-api-access-8gqkz\") pod \"780425c2-7f97-4d00-a992-bf0ef5be3876\" (UID: \"780425c2-7f97-4d00-a992-bf0ef5be3876\") "
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.100257 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/780425c2-7f97-4d00-a992-bf0ef5be3876-kube-api-access-8gqkz" (OuterVolumeSpecName: "kube-api-access-8gqkz") pod "780425c2-7f97-4d00-a992-bf0ef5be3876" (UID: "780425c2-7f97-4d00-a992-bf0ef5be3876"). InnerVolumeSpecName "kube-api-access-8gqkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.119359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "780425c2-7f97-4d00-a992-bf0ef5be3876" (UID: "780425c2-7f97-4d00-a992-bf0ef5be3876"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.122896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-inventory" (OuterVolumeSpecName: "inventory") pod "780425c2-7f97-4d00-a992-bf0ef5be3876" (UID: "780425c2-7f97-4d00-a992-bf0ef5be3876"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.195603 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.195632 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780425c2-7f97-4d00-a992-bf0ef5be3876-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.195643 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gqkz\" (UniqueName: \"kubernetes.io/projected/780425c2-7f97-4d00-a992-bf0ef5be3876-kube-api-access-8gqkz\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.674950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z" event={"ID":"780425c2-7f97-4d00-a992-bf0ef5be3876","Type":"ContainerDied","Data":"56df161d0ed31ae457d9e0753f5547ce79647cf3ab8c6a75de0ab7ab94c5e9ca"}
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.675323 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56df161d0ed31ae457d9e0753f5547ce79647cf3ab8c6a75de0ab7ab94c5e9ca"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.675013 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qw75z"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.805066 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz7fw"]
Feb 19 19:05:17 crc kubenswrapper[4749]: E0219 19:05:17.807302 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="780425c2-7f97-4d00-a992-bf0ef5be3876" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.807335 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="780425c2-7f97-4d00-a992-bf0ef5be3876" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.807767 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="780425c2-7f97-4d00-a992-bf0ef5be3876" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.808555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.811483 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.811986 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.812189 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.812368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98"
Feb 19 19:05:17 crc kubenswrapper[4749]: I0219 19:05:17.824095 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz7fw"]
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.011881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhpqt\" (UniqueName: \"kubernetes.io/projected/31a27783-6092-4112-97a0-2335f4f251b4-kube-api-access-zhpqt\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.011933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.012060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.113358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhpqt\" (UniqueName: \"kubernetes.io/projected/31a27783-6092-4112-97a0-2335f4f251b4-kube-api-access-zhpqt\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.113421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.113537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.118799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.127212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.130882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhpqt\" (UniqueName: \"kubernetes.io/projected/31a27783-6092-4112-97a0-2335f4f251b4-kube-api-access-zhpqt\") pod \"ssh-known-hosts-edpm-deployment-dz7fw\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.144855 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.421303 4749 scope.go:117] "RemoveContainer" containerID="5235264586dc444c39526b9f671323ba394d06f3ac246609350fd331e16a09fa"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.467276 4749 scope.go:117] "RemoveContainer" containerID="e741afc503439fe8962a221396ee12bb86f3f6ae7001a6e89147f203bd41ac97"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.632408 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz7fw"]
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.679238 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:05:18 crc kubenswrapper[4749]: E0219 19:05:18.679518 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:05:18 crc kubenswrapper[4749]: I0219 19:05:18.689680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw" event={"ID":"31a27783-6092-4112-97a0-2335f4f251b4","Type":"ContainerStarted","Data":"208fd1bd5c991b76c0e7074ae465f83333f2470ca6affc600a6916da6619e301"}
Feb 19 19:05:19 crc kubenswrapper[4749]: I0219 19:05:19.697015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw" event={"ID":"31a27783-6092-4112-97a0-2335f4f251b4","Type":"ContainerStarted","Data":"9248e12633d899bf4312ebef10893c456bd3aa07fc7b4dd07d54b0bf7378ab7c"}
Feb 19 19:05:19 crc kubenswrapper[4749]: I0219 19:05:19.718939 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw" podStartSLOduration=2.258578347 podStartE2EDuration="2.718922574s" podCreationTimestamp="2026-02-19 19:05:17 +0000 UTC" firstStartedPulling="2026-02-19 19:05:18.635940024 +0000 UTC m=+1892.597159968" lastFinishedPulling="2026-02-19 19:05:19.096284241 +0000 UTC m=+1893.057504195" observedRunningTime="2026-02-19 19:05:19.714950448 +0000 UTC m=+1893.676170402" watchObservedRunningTime="2026-02-19 19:05:19.718922574 +0000 UTC m=+1893.680142528"
Feb 19 19:05:20 crc kubenswrapper[4749]: I0219 19:05:20.040999 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4j58v"]
Feb 19 19:05:20 crc kubenswrapper[4749]: I0219 19:05:20.053070 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4j58v"]
Feb 19 19:05:20 crc kubenswrapper[4749]: I0219 19:05:20.690575 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886c599b-2da2-4553-bc21-4b0b4e50c3bc" path="/var/lib/kubelet/pods/886c599b-2da2-4553-bc21-4b0b4e50c3bc/volumes"
Feb 19 19:05:25 crc kubenswrapper[4749]: E0219 19:05:25.623406 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31a27783_6092_4112_97a0_2335f4f251b4.slice/crio-9248e12633d899bf4312ebef10893c456bd3aa07fc7b4dd07d54b0bf7378ab7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31a27783_6092_4112_97a0_2335f4f251b4.slice/crio-conmon-9248e12633d899bf4312ebef10893c456bd3aa07fc7b4dd07d54b0bf7378ab7c.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 19:05:25 crc kubenswrapper[4749]: I0219 19:05:25.742237 4749 generic.go:334] "Generic (PLEG): container finished" podID="31a27783-6092-4112-97a0-2335f4f251b4" containerID="9248e12633d899bf4312ebef10893c456bd3aa07fc7b4dd07d54b0bf7378ab7c" exitCode=0
Feb 19 19:05:25 crc kubenswrapper[4749]: I0219 19:05:25.742501 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw" event={"ID":"31a27783-6092-4112-97a0-2335f4f251b4","Type":"ContainerDied","Data":"9248e12633d899bf4312ebef10893c456bd3aa07fc7b4dd07d54b0bf7378ab7c"}
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.164673 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.205482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhpqt\" (UniqueName: \"kubernetes.io/projected/31a27783-6092-4112-97a0-2335f4f251b4-kube-api-access-zhpqt\") pod \"31a27783-6092-4112-97a0-2335f4f251b4\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") "
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.205756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-inventory-0\") pod \"31a27783-6092-4112-97a0-2335f4f251b4\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") "
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.205779 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-ssh-key-openstack-edpm-ipam\") pod \"31a27783-6092-4112-97a0-2335f4f251b4\" (UID: \"31a27783-6092-4112-97a0-2335f4f251b4\") "
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.214643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a27783-6092-4112-97a0-2335f4f251b4-kube-api-access-zhpqt" (OuterVolumeSpecName: "kube-api-access-zhpqt") pod "31a27783-6092-4112-97a0-2335f4f251b4" (UID: "31a27783-6092-4112-97a0-2335f4f251b4"). InnerVolumeSpecName "kube-api-access-zhpqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.238931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "31a27783-6092-4112-97a0-2335f4f251b4" (UID: "31a27783-6092-4112-97a0-2335f4f251b4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.240134 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31a27783-6092-4112-97a0-2335f4f251b4" (UID: "31a27783-6092-4112-97a0-2335f4f251b4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.308694 4749 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.308743 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a27783-6092-4112-97a0-2335f4f251b4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.308757 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhpqt\" (UniqueName: \"kubernetes.io/projected/31a27783-6092-4112-97a0-2335f4f251b4-kube-api-access-zhpqt\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.768905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw" event={"ID":"31a27783-6092-4112-97a0-2335f4f251b4","Type":"ContainerDied","Data":"208fd1bd5c991b76c0e7074ae465f83333f2470ca6affc600a6916da6619e301"}
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.768968 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="208fd1bd5c991b76c0e7074ae465f83333f2470ca6affc600a6916da6619e301"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.768942 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz7fw"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.830966 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv"]
Feb 19 19:05:27 crc kubenswrapper[4749]: E0219 19:05:27.832262 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a27783-6092-4112-97a0-2335f4f251b4" containerName="ssh-known-hosts-edpm-deployment"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.832361 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a27783-6092-4112-97a0-2335f4f251b4" containerName="ssh-known-hosts-edpm-deployment"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.833258 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a27783-6092-4112-97a0-2335f4f251b4" containerName="ssh-known-hosts-edpm-deployment"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.834593 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.837899 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.838443 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.838689 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.838990 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.846832 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv"]
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.922824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.922941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4bl\" (UniqueName: \"kubernetes.io/projected/b354c1a0-43cd-442f-b818-54fc0bc89cad-kube-api-access-tf4bl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv"
Feb 19 19:05:27 crc kubenswrapper[4749]: I0219 19:05:27.923067 4749 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.024276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.024437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.024491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4bl\" (UniqueName: \"kubernetes.io/projected/b354c1a0-43cd-442f-b818-54fc0bc89cad-kube-api-access-tf4bl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.028668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.029184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.041913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4bl\" (UniqueName: \"kubernetes.io/projected/b354c1a0-43cd-442f-b818-54fc0bc89cad-kube-api-access-tf4bl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bqhvv\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.163487 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.605535 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv"] Feb 19 19:05:28 crc kubenswrapper[4749]: I0219 19:05:28.780505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" event={"ID":"b354c1a0-43cd-442f-b818-54fc0bc89cad","Type":"ContainerStarted","Data":"a58cdb8c148c8eb1b095104c56f9038be6726fb58cdf7c1ece46534e9d5f9f2b"} Feb 19 19:05:29 crc kubenswrapper[4749]: I0219 19:05:29.790524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" event={"ID":"b354c1a0-43cd-442f-b818-54fc0bc89cad","Type":"ContainerStarted","Data":"f853a2467f109ca91564f6612ec4866bd57d42394f530cabec19434d70cf59a1"} Feb 19 19:05:29 crc kubenswrapper[4749]: I0219 19:05:29.810856 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" podStartSLOduration=2.430885848 podStartE2EDuration="2.810831805s" podCreationTimestamp="2026-02-19 19:05:27 +0000 UTC" firstStartedPulling="2026-02-19 19:05:28.610678518 +0000 UTC m=+1902.571898472" lastFinishedPulling="2026-02-19 19:05:28.990624475 +0000 UTC m=+1902.951844429" observedRunningTime="2026-02-19 19:05:29.805054555 +0000 UTC m=+1903.766274509" watchObservedRunningTime="2026-02-19 19:05:29.810831805 +0000 UTC m=+1903.772051759" Feb 19 19:05:32 crc kubenswrapper[4749]: I0219 19:05:32.680400 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:05:32 crc kubenswrapper[4749]: E0219 19:05:32.680947 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.400648 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phkxw"] Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.402815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.422982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-utilities\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.423045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-catalog-content\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.423478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plsdh\" (UniqueName: \"kubernetes.io/projected/356a254a-9963-46fb-8c4e-97414e163c64-kube-api-access-plsdh\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.424557 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-phkxw"] Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.525702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plsdh\" (UniqueName: \"kubernetes.io/projected/356a254a-9963-46fb-8c4e-97414e163c64-kube-api-access-plsdh\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.525807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-utilities\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.525841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-catalog-content\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.526552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-catalog-content\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.526558 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-utilities\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 
crc kubenswrapper[4749]: I0219 19:05:33.564899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plsdh\" (UniqueName: \"kubernetes.io/projected/356a254a-9963-46fb-8c4e-97414e163c64-kube-api-access-plsdh\") pod \"redhat-operators-phkxw\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") " pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:33 crc kubenswrapper[4749]: I0219 19:05:33.720833 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkxw" Feb 19 19:05:34 crc kubenswrapper[4749]: I0219 19:05:34.215174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phkxw"] Feb 19 19:05:34 crc kubenswrapper[4749]: I0219 19:05:34.840842 4749 generic.go:334] "Generic (PLEG): container finished" podID="356a254a-9963-46fb-8c4e-97414e163c64" containerID="09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6" exitCode=0 Feb 19 19:05:34 crc kubenswrapper[4749]: I0219 19:05:34.840897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkxw" event={"ID":"356a254a-9963-46fb-8c4e-97414e163c64","Type":"ContainerDied","Data":"09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6"} Feb 19 19:05:34 crc kubenswrapper[4749]: I0219 19:05:34.841260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkxw" event={"ID":"356a254a-9963-46fb-8c4e-97414e163c64","Type":"ContainerStarted","Data":"fbdb200ebdf3f29b89e98d81a7b75b36a301c53cf22173ed7ef9c5d578cb78d9"} Feb 19 19:05:34 crc kubenswrapper[4749]: I0219 19:05:34.844900 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.763564 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4kjtl"] Feb 19 19:05:35 crc 
kubenswrapper[4749]: I0219 19:05:35.770626 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.782051 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4kjtl"] Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.867114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4n8w\" (UniqueName: \"kubernetes.io/projected/718b5905-f54d-4d1a-829b-9009840ab909-kube-api-access-w4n8w\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.868280 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-utilities\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.868370 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-catalog-content\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.969883 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-utilities\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc 
kubenswrapper[4749]: I0219 19:05:35.969927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-catalog-content\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.969959 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4n8w\" (UniqueName: \"kubernetes.io/projected/718b5905-f54d-4d1a-829b-9009840ab909-kube-api-access-w4n8w\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.970492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-utilities\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.970582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-catalog-content\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:35 crc kubenswrapper[4749]: I0219 19:05:35.991242 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4n8w\" (UniqueName: \"kubernetes.io/projected/718b5905-f54d-4d1a-829b-9009840ab909-kube-api-access-w4n8w\") pod \"certified-operators-4kjtl\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") " pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:36 crc kubenswrapper[4749]: I0219 
19:05:36.117156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:36 crc kubenswrapper[4749]: I0219 19:05:36.671796 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4kjtl"] Feb 19 19:05:36 crc kubenswrapper[4749]: I0219 19:05:36.864288 4749 generic.go:334] "Generic (PLEG): container finished" podID="718b5905-f54d-4d1a-829b-9009840ab909" containerID="1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48" exitCode=0 Feb 19 19:05:36 crc kubenswrapper[4749]: I0219 19:05:36.864411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kjtl" event={"ID":"718b5905-f54d-4d1a-829b-9009840ab909","Type":"ContainerDied","Data":"1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48"} Feb 19 19:05:36 crc kubenswrapper[4749]: I0219 19:05:36.864464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kjtl" event={"ID":"718b5905-f54d-4d1a-829b-9009840ab909","Type":"ContainerStarted","Data":"ff137339bcc9531bd2689fefaa5d1f855cbe27965e2c092547ee0775877ee85a"} Feb 19 19:05:36 crc kubenswrapper[4749]: I0219 19:05:36.867384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkxw" event={"ID":"356a254a-9963-46fb-8c4e-97414e163c64","Type":"ContainerStarted","Data":"09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429"} Feb 19 19:05:37 crc kubenswrapper[4749]: I0219 19:05:37.878874 4749 generic.go:334] "Generic (PLEG): container finished" podID="b354c1a0-43cd-442f-b818-54fc0bc89cad" containerID="f853a2467f109ca91564f6612ec4866bd57d42394f530cabec19434d70cf59a1" exitCode=0 Feb 19 19:05:37 crc kubenswrapper[4749]: I0219 19:05:37.879216 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" 
event={"ID":"b354c1a0-43cd-442f-b818-54fc0bc89cad","Type":"ContainerDied","Data":"f853a2467f109ca91564f6612ec4866bd57d42394f530cabec19434d70cf59a1"} Feb 19 19:05:38 crc kubenswrapper[4749]: I0219 19:05:38.888457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kjtl" event={"ID":"718b5905-f54d-4d1a-829b-9009840ab909","Type":"ContainerStarted","Data":"d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05"} Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.349312 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.539935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf4bl\" (UniqueName: \"kubernetes.io/projected/b354c1a0-43cd-442f-b818-54fc0bc89cad-kube-api-access-tf4bl\") pod \"b354c1a0-43cd-442f-b818-54fc0bc89cad\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.540041 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-inventory\") pod \"b354c1a0-43cd-442f-b818-54fc0bc89cad\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.540100 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-ssh-key-openstack-edpm-ipam\") pod \"b354c1a0-43cd-442f-b818-54fc0bc89cad\" (UID: \"b354c1a0-43cd-442f-b818-54fc0bc89cad\") " Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.553045 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b354c1a0-43cd-442f-b818-54fc0bc89cad-kube-api-access-tf4bl" 
(OuterVolumeSpecName: "kube-api-access-tf4bl") pod "b354c1a0-43cd-442f-b818-54fc0bc89cad" (UID: "b354c1a0-43cd-442f-b818-54fc0bc89cad"). InnerVolumeSpecName "kube-api-access-tf4bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.566991 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-inventory" (OuterVolumeSpecName: "inventory") pod "b354c1a0-43cd-442f-b818-54fc0bc89cad" (UID: "b354c1a0-43cd-442f-b818-54fc0bc89cad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.568193 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b354c1a0-43cd-442f-b818-54fc0bc89cad" (UID: "b354c1a0-43cd-442f-b818-54fc0bc89cad"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.642611 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf4bl\" (UniqueName: \"kubernetes.io/projected/b354c1a0-43cd-442f-b818-54fc0bc89cad-kube-api-access-tf4bl\") on node \"crc\" DevicePath \"\"" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.642652 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.642662 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b354c1a0-43cd-442f-b818-54fc0bc89cad-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.905478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" event={"ID":"b354c1a0-43cd-442f-b818-54fc0bc89cad","Type":"ContainerDied","Data":"a58cdb8c148c8eb1b095104c56f9038be6726fb58cdf7c1ece46534e9d5f9f2b"} Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.905515 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bqhvv" Feb 19 19:05:39 crc kubenswrapper[4749]: I0219 19:05:39.905534 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58cdb8c148c8eb1b095104c56f9038be6726fb58cdf7c1ece46534e9d5f9f2b" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.082762 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd"] Feb 19 19:05:40 crc kubenswrapper[4749]: E0219 19:05:40.083459 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b354c1a0-43cd-442f-b818-54fc0bc89cad" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.083493 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b354c1a0-43cd-442f-b818-54fc0bc89cad" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.083939 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b354c1a0-43cd-442f-b818-54fc0bc89cad" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.085356 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.087296 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.087437 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.088288 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.088335 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.098627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd"] Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.255645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/52a3df8c-e606-4fe8-990f-cef2a807956d-kube-api-access-x8nbq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.256543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 
19:05:40.256667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.359257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.359536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.359725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/52a3df8c-e606-4fe8-990f-cef2a807956d-kube-api-access-x8nbq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.368066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.368066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.386635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/52a3df8c-e606-4fe8-990f-cef2a807956d-kube-api-access-x8nbq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.410045 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:40 crc kubenswrapper[4749]: I0219 19:05:40.965618 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd"] Feb 19 19:05:40 crc kubenswrapper[4749]: W0219 19:05:40.968952 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a3df8c_e606_4fe8_990f_cef2a807956d.slice/crio-4ced978e34a61c0f2eaab72c5aa0fbec0e4fcf5d07f03c7b170896be22a4dc64 WatchSource:0}: Error finding container 4ced978e34a61c0f2eaab72c5aa0fbec0e4fcf5d07f03c7b170896be22a4dc64: Status 404 returned error can't find the container with id 4ced978e34a61c0f2eaab72c5aa0fbec0e4fcf5d07f03c7b170896be22a4dc64 Feb 19 19:05:41 crc kubenswrapper[4749]: I0219 19:05:41.921949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" event={"ID":"52a3df8c-e606-4fe8-990f-cef2a807956d","Type":"ContainerStarted","Data":"4ced978e34a61c0f2eaab72c5aa0fbec0e4fcf5d07f03c7b170896be22a4dc64"} Feb 19 19:05:42 crc kubenswrapper[4749]: I0219 19:05:42.930758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" event={"ID":"52a3df8c-e606-4fe8-990f-cef2a807956d","Type":"ContainerStarted","Data":"4e059bdf69e8d5dadf65ab23420048396565f9c0a64a4fc96641d31f9f2d5c87"} Feb 19 19:05:42 crc kubenswrapper[4749]: I0219 19:05:42.933794 4749 generic.go:334] "Generic (PLEG): container finished" podID="718b5905-f54d-4d1a-829b-9009840ab909" containerID="d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05" exitCode=0 Feb 19 19:05:42 crc kubenswrapper[4749]: I0219 19:05:42.933862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kjtl" 
event={"ID":"718b5905-f54d-4d1a-829b-9009840ab909","Type":"ContainerDied","Data":"d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05"} Feb 19 19:05:42 crc kubenswrapper[4749]: I0219 19:05:42.938375 4749 generic.go:334] "Generic (PLEG): container finished" podID="356a254a-9963-46fb-8c4e-97414e163c64" containerID="09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429" exitCode=0 Feb 19 19:05:42 crc kubenswrapper[4749]: I0219 19:05:42.938405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkxw" event={"ID":"356a254a-9963-46fb-8c4e-97414e163c64","Type":"ContainerDied","Data":"09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429"} Feb 19 19:05:42 crc kubenswrapper[4749]: I0219 19:05:42.980491 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" podStartSLOduration=2.2917213849999998 podStartE2EDuration="2.980475393s" podCreationTimestamp="2026-02-19 19:05:40 +0000 UTC" firstStartedPulling="2026-02-19 19:05:40.970766564 +0000 UTC m=+1914.931986528" lastFinishedPulling="2026-02-19 19:05:41.659520582 +0000 UTC m=+1915.620740536" observedRunningTime="2026-02-19 19:05:42.970673717 +0000 UTC m=+1916.931893671" watchObservedRunningTime="2026-02-19 19:05:42.980475393 +0000 UTC m=+1916.941695347" Feb 19 19:05:43 crc kubenswrapper[4749]: I0219 19:05:43.948332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kjtl" event={"ID":"718b5905-f54d-4d1a-829b-9009840ab909","Type":"ContainerStarted","Data":"52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180"} Feb 19 19:05:43 crc kubenswrapper[4749]: I0219 19:05:43.950451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkxw" 
event={"ID":"356a254a-9963-46fb-8c4e-97414e163c64","Type":"ContainerStarted","Data":"88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05"} Feb 19 19:05:43 crc kubenswrapper[4749]: I0219 19:05:43.968420 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4kjtl" podStartSLOduration=2.497974794 podStartE2EDuration="8.968399629s" podCreationTimestamp="2026-02-19 19:05:35 +0000 UTC" firstStartedPulling="2026-02-19 19:05:36.866905208 +0000 UTC m=+1910.828125162" lastFinishedPulling="2026-02-19 19:05:43.337330053 +0000 UTC m=+1917.298549997" observedRunningTime="2026-02-19 19:05:43.966266888 +0000 UTC m=+1917.927486842" watchObservedRunningTime="2026-02-19 19:05:43.968399629 +0000 UTC m=+1917.929619583" Feb 19 19:05:43 crc kubenswrapper[4749]: I0219 19:05:43.998642 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phkxw" podStartSLOduration=2.36744983 podStartE2EDuration="10.998624348s" podCreationTimestamp="2026-02-19 19:05:33 +0000 UTC" firstStartedPulling="2026-02-19 19:05:34.844704688 +0000 UTC m=+1908.805924642" lastFinishedPulling="2026-02-19 19:05:43.475879216 +0000 UTC m=+1917.437099160" observedRunningTime="2026-02-19 19:05:43.993401722 +0000 UTC m=+1917.954621686" watchObservedRunningTime="2026-02-19 19:05:43.998624348 +0000 UTC m=+1917.959844302" Feb 19 19:05:45 crc kubenswrapper[4749]: I0219 19:05:45.679302 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:05:45 crc kubenswrapper[4749]: E0219 19:05:45.679754 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:05:46 crc kubenswrapper[4749]: I0219 19:05:46.118287 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:46 crc kubenswrapper[4749]: I0219 19:05:46.118334 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:46 crc kubenswrapper[4749]: I0219 19:05:46.166266 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4kjtl" Feb 19 19:05:51 crc kubenswrapper[4749]: I0219 19:05:51.009613 4749 generic.go:334] "Generic (PLEG): container finished" podID="52a3df8c-e606-4fe8-990f-cef2a807956d" containerID="4e059bdf69e8d5dadf65ab23420048396565f9c0a64a4fc96641d31f9f2d5c87" exitCode=0 Feb 19 19:05:51 crc kubenswrapper[4749]: I0219 19:05:51.009702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" event={"ID":"52a3df8c-e606-4fe8-990f-cef2a807956d","Type":"ContainerDied","Data":"4e059bdf69e8d5dadf65ab23420048396565f9c0a64a4fc96641d31f9f2d5c87"} Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.495485 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.574741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-inventory\") pod \"52a3df8c-e606-4fe8-990f-cef2a807956d\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.574817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-ssh-key-openstack-edpm-ipam\") pod \"52a3df8c-e606-4fe8-990f-cef2a807956d\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.574973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/52a3df8c-e606-4fe8-990f-cef2a807956d-kube-api-access-x8nbq\") pod \"52a3df8c-e606-4fe8-990f-cef2a807956d\" (UID: \"52a3df8c-e606-4fe8-990f-cef2a807956d\") " Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.581766 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a3df8c-e606-4fe8-990f-cef2a807956d-kube-api-access-x8nbq" (OuterVolumeSpecName: "kube-api-access-x8nbq") pod "52a3df8c-e606-4fe8-990f-cef2a807956d" (UID: "52a3df8c-e606-4fe8-990f-cef2a807956d"). InnerVolumeSpecName "kube-api-access-x8nbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.615378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-inventory" (OuterVolumeSpecName: "inventory") pod "52a3df8c-e606-4fe8-990f-cef2a807956d" (UID: "52a3df8c-e606-4fe8-990f-cef2a807956d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.626626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "52a3df8c-e606-4fe8-990f-cef2a807956d" (UID: "52a3df8c-e606-4fe8-990f-cef2a807956d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.677721 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.677960 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52a3df8c-e606-4fe8-990f-cef2a807956d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:05:52 crc kubenswrapper[4749]: I0219 19:05:52.678058 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/52a3df8c-e606-4fe8-990f-cef2a807956d-kube-api-access-x8nbq\") on node \"crc\" DevicePath \"\"" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.032425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" event={"ID":"52a3df8c-e606-4fe8-990f-cef2a807956d","Type":"ContainerDied","Data":"4ced978e34a61c0f2eaab72c5aa0fbec0e4fcf5d07f03c7b170896be22a4dc64"} Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.032790 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ced978e34a61c0f2eaab72c5aa0fbec0e4fcf5d07f03c7b170896be22a4dc64" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 
19:05:53.032467 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.178019 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"] Feb 19 19:05:53 crc kubenswrapper[4749]: E0219 19:05:53.178654 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a3df8c-e606-4fe8-990f-cef2a807956d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.178681 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a3df8c-e606-4fe8-990f-cef2a807956d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.178946 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a3df8c-e606-4fe8-990f-cef2a807956d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.179769 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.183659 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.183908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.184281 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.185016 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.188001 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.188057 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.189077 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.190867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"] Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.192745 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.290870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.290916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.290982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmrt\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-kube-api-access-6lmrt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291287 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291373 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291465 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.291692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmrt\" (UniqueName: 
\"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-kube-api-access-6lmrt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393630 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: 
\"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.393925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.399615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.399618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.400539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.401134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.401201 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 
19:05:53.401404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.401950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.402134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.402685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.402871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.403803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.404147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.409990 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmrt\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-kube-api-access-6lmrt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.411652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.502528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.721268 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phkxw"
Feb 19 19:05:53 crc kubenswrapper[4749]: I0219 19:05:53.724717 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phkxw"
Feb 19 19:05:54 crc kubenswrapper[4749]: I0219 19:05:54.023949 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"]
Feb 19 19:05:54 crc kubenswrapper[4749]: I0219 19:05:54.047549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" event={"ID":"486b7134-04b2-4255-b831-c7da1c6fdcfe","Type":"ContainerStarted","Data":"d717abefb7075228e451ff92f8569360c4f89dd3ef7460e15d69a89040ccccfa"}
Feb 19 19:05:54 crc kubenswrapper[4749]: I0219 19:05:54.784199 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-phkxw" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:05:54 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:05:54 crc kubenswrapper[4749]: >
Feb 19 19:05:56 crc kubenswrapper[4749]: I0219 19:05:56.975137 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4kjtl"
Feb 19 19:05:57 crc kubenswrapper[4749]: I0219 19:05:57.037909 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4kjtl"]
Feb 19 19:05:57 crc kubenswrapper[4749]: I0219 19:05:57.679562 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:05:57 crc kubenswrapper[4749]: E0219 19:05:57.680132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:05:57 crc kubenswrapper[4749]: I0219 19:05:57.961253 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4kjtl" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="registry-server" containerID="cri-o://52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180" gracePeriod=2
Feb 19 19:05:57 crc kubenswrapper[4749]: I0219 19:05:57.962076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" event={"ID":"486b7134-04b2-4255-b831-c7da1c6fdcfe","Type":"ContainerStarted","Data":"2fb3b167d0f6cb0ac5a25ef6480bac2f94517f15a5d7fe7e3dc21ed3f6dbed92"}
Feb 19 19:05:57 crc kubenswrapper[4749]: I0219 19:05:57.988661 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" podStartSLOduration=4.587290297 podStartE2EDuration="4.988647441s" podCreationTimestamp="2026-02-19 19:05:53 +0000 UTC" firstStartedPulling="2026-02-19 19:05:54.027762055 +0000 UTC m=+1927.988982029" lastFinishedPulling="2026-02-19 19:05:54.429119219 +0000 UTC m=+1928.390339173" observedRunningTime="2026-02-19 19:05:57.985017903 +0000 UTC m=+1931.946237917" watchObservedRunningTime="2026-02-19 19:05:57.988647441 +0000 UTC m=+1931.949867395"
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.420201 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kjtl"
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.563284 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-utilities\") pod \"718b5905-f54d-4d1a-829b-9009840ab909\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") "
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.563376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4n8w\" (UniqueName: \"kubernetes.io/projected/718b5905-f54d-4d1a-829b-9009840ab909-kube-api-access-w4n8w\") pod \"718b5905-f54d-4d1a-829b-9009840ab909\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") "
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.563507 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-catalog-content\") pod \"718b5905-f54d-4d1a-829b-9009840ab909\" (UID: \"718b5905-f54d-4d1a-829b-9009840ab909\") "
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.564265 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-utilities" (OuterVolumeSpecName: "utilities") pod "718b5905-f54d-4d1a-829b-9009840ab909" (UID: "718b5905-f54d-4d1a-829b-9009840ab909"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.570255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718b5905-f54d-4d1a-829b-9009840ab909-kube-api-access-w4n8w" (OuterVolumeSpecName: "kube-api-access-w4n8w") pod "718b5905-f54d-4d1a-829b-9009840ab909" (UID: "718b5905-f54d-4d1a-829b-9009840ab909"). InnerVolumeSpecName "kube-api-access-w4n8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.612324 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "718b5905-f54d-4d1a-829b-9009840ab909" (UID: "718b5905-f54d-4d1a-829b-9009840ab909"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.665285 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.665320 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718b5905-f54d-4d1a-829b-9009840ab909-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.665332 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4n8w\" (UniqueName: \"kubernetes.io/projected/718b5905-f54d-4d1a-829b-9009840ab909-kube-api-access-w4n8w\") on node \"crc\" DevicePath \"\""
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.972552 4749 generic.go:334] "Generic (PLEG): container finished" podID="718b5905-f54d-4d1a-829b-9009840ab909" containerID="52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180" exitCode=0
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.972615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kjtl" event={"ID":"718b5905-f54d-4d1a-829b-9009840ab909","Type":"ContainerDied","Data":"52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180"}
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.972707 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kjtl" event={"ID":"718b5905-f54d-4d1a-829b-9009840ab909","Type":"ContainerDied","Data":"ff137339bcc9531bd2689fefaa5d1f855cbe27965e2c092547ee0775877ee85a"}
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.972717 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kjtl"
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.972732 4749 scope.go:117] "RemoveContainer" containerID="52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180"
Feb 19 19:05:58 crc kubenswrapper[4749]: I0219 19:05:58.999065 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4kjtl"]
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.009333 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4kjtl"]
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.011760 4749 scope.go:117] "RemoveContainer" containerID="d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05"
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.052153 4749 scope.go:117] "RemoveContainer" containerID="1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48"
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.082996 4749 scope.go:117] "RemoveContainer" containerID="52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180"
Feb 19 19:05:59 crc kubenswrapper[4749]: E0219 19:05:59.083507 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180\": container with ID starting with 52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180 not found: ID does not exist" containerID="52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180"
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.083545 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180"} err="failed to get container status \"52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180\": rpc error: code = NotFound desc = could not find container \"52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180\": container with ID starting with 52bad48dac56d7de2cf750ba41935c5b0d3867e270f262218c74723449674180 not found: ID does not exist"
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.083570 4749 scope.go:117] "RemoveContainer" containerID="d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05"
Feb 19 19:05:59 crc kubenswrapper[4749]: E0219 19:05:59.083794 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05\": container with ID starting with d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05 not found: ID does not exist" containerID="d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05"
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.083824 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05"} err="failed to get container status \"d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05\": rpc error: code = NotFound desc = could not find container \"d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05\": container with ID starting with d494190809ddaefedf6efdf5011bac32793db30a7c65ef78f3b832085e77bf05 not found: ID does not exist"
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.083841 4749 scope.go:117] "RemoveContainer" containerID="1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48"
Feb 19 19:05:59 crc kubenswrapper[4749]: E0219 19:05:59.084283 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48\": container with ID starting with 1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48 not found: ID does not exist" containerID="1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48"
Feb 19 19:05:59 crc kubenswrapper[4749]: I0219 19:05:59.084308 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48"} err="failed to get container status \"1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48\": rpc error: code = NotFound desc = could not find container \"1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48\": container with ID starting with 1bd35351f55184fe26d509e45059c078fd18104013b0c34c81409df28fe01a48 not found: ID does not exist"
Feb 19 19:06:00 crc kubenswrapper[4749]: I0219 19:06:00.689768 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718b5905-f54d-4d1a-829b-9009840ab909" path="/var/lib/kubelet/pods/718b5905-f54d-4d1a-829b-9009840ab909/volumes"
Feb 19 19:06:03 crc kubenswrapper[4749]: I0219 19:06:03.788735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phkxw"
Feb 19 19:06:03 crc kubenswrapper[4749]: I0219 19:06:03.877913 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phkxw"
Feb 19 19:06:04 crc kubenswrapper[4749]: I0219 19:06:04.037210 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phkxw"]
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.026654 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phkxw" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="registry-server" containerID="cri-o://88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05" gracePeriod=2
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.499075 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkxw"
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.636995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-utilities\") pod \"356a254a-9963-46fb-8c4e-97414e163c64\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") "
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.637293 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plsdh\" (UniqueName: \"kubernetes.io/projected/356a254a-9963-46fb-8c4e-97414e163c64-kube-api-access-plsdh\") pod \"356a254a-9963-46fb-8c4e-97414e163c64\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") "
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.637493 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-catalog-content\") pod \"356a254a-9963-46fb-8c4e-97414e163c64\" (UID: \"356a254a-9963-46fb-8c4e-97414e163c64\") "
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.637686 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-utilities" (OuterVolumeSpecName: "utilities") pod "356a254a-9963-46fb-8c4e-97414e163c64" (UID: "356a254a-9963-46fb-8c4e-97414e163c64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.638047 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.642921 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356a254a-9963-46fb-8c4e-97414e163c64-kube-api-access-plsdh" (OuterVolumeSpecName: "kube-api-access-plsdh") pod "356a254a-9963-46fb-8c4e-97414e163c64" (UID: "356a254a-9963-46fb-8c4e-97414e163c64"). InnerVolumeSpecName "kube-api-access-plsdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.739842 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plsdh\" (UniqueName: \"kubernetes.io/projected/356a254a-9963-46fb-8c4e-97414e163c64-kube-api-access-plsdh\") on node \"crc\" DevicePath \"\""
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.765631 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "356a254a-9963-46fb-8c4e-97414e163c64" (UID: "356a254a-9963-46fb-8c4e-97414e163c64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:06:05 crc kubenswrapper[4749]: I0219 19:06:05.841681 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356a254a-9963-46fb-8c4e-97414e163c64-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.047906 4749 generic.go:334] "Generic (PLEG): container finished" podID="356a254a-9963-46fb-8c4e-97414e163c64" containerID="88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05" exitCode=0
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.047960 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkxw" event={"ID":"356a254a-9963-46fb-8c4e-97414e163c64","Type":"ContainerDied","Data":"88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05"}
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.048005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkxw" event={"ID":"356a254a-9963-46fb-8c4e-97414e163c64","Type":"ContainerDied","Data":"fbdb200ebdf3f29b89e98d81a7b75b36a301c53cf22173ed7ef9c5d578cb78d9"}
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.048046 4749 scope.go:117] "RemoveContainer" containerID="88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.048212 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkxw"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.079127 4749 scope.go:117] "RemoveContainer" containerID="09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.091956 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phkxw"]
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.101477 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phkxw"]
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.107308 4749 scope.go:117] "RemoveContainer" containerID="09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.144903 4749 scope.go:117] "RemoveContainer" containerID="88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05"
Feb 19 19:06:06 crc kubenswrapper[4749]: E0219 19:06:06.145343 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05\": container with ID starting with 88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05 not found: ID does not exist" containerID="88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.145386 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05"} err="failed to get container status \"88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05\": rpc error: code = NotFound desc = could not find container \"88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05\": container with ID starting with 88de1f32cdf2b2d648a5934ee0130691a2f8d98ed5eba4de39ac9d01287b5e05 not found: ID does not exist"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.145414 4749 scope.go:117] "RemoveContainer" containerID="09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429"
Feb 19 19:06:06 crc kubenswrapper[4749]: E0219 19:06:06.145907 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429\": container with ID starting with 09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429 not found: ID does not exist" containerID="09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.145928 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429"} err="failed to get container status \"09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429\": rpc error: code = NotFound desc = could not find container \"09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429\": container with ID starting with 09de8547cbe9f502a11f18624666c91fb9d099eddbf30d69e2043f8ae5221429 not found: ID does not exist"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.145941 4749 scope.go:117] "RemoveContainer" containerID="09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6"
Feb 19 19:06:06 crc kubenswrapper[4749]: E0219 19:06:06.146229 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6\": container with ID starting with 09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6 not found: ID does not exist" containerID="09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.146260 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6"} err="failed to get container status \"09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6\": rpc error: code = NotFound desc = could not find container \"09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6\": container with ID starting with 09da5846e964121505fd6c5f198a160f8afa6b0b64af7d74cbda4f2de69d08b6 not found: ID does not exist"
Feb 19 19:06:06 crc kubenswrapper[4749]: I0219 19:06:06.691184 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="356a254a-9963-46fb-8c4e-97414e163c64" path="/var/lib/kubelet/pods/356a254a-9963-46fb-8c4e-97414e163c64/volumes"
Feb 19 19:06:08 crc kubenswrapper[4749]: I0219 19:06:08.679297 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:06:08 crc kubenswrapper[4749]: E0219 19:06:08.679794 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:06:18 crc kubenswrapper[4749]: I0219 19:06:18.553637 4749 scope.go:117] "RemoveContainer" containerID="f59c625d7a05c202e5b8535a609c6df2377e8d2d375519dd3af845ed1b81b0a2"
Feb 19 19:06:21 crc kubenswrapper[4749]: I0219 19:06:21.678896 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:06:21 crc kubenswrapper[4749]: E0219 19:06:21.679530 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:06:33 crc kubenswrapper[4749]: I0219 19:06:33.302170 4749 generic.go:334] "Generic (PLEG): container finished" podID="486b7134-04b2-4255-b831-c7da1c6fdcfe" containerID="2fb3b167d0f6cb0ac5a25ef6480bac2f94517f15a5d7fe7e3dc21ed3f6dbed92" exitCode=0
Feb 19 19:06:33 crc kubenswrapper[4749]: I0219 19:06:33.302245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" event={"ID":"486b7134-04b2-4255-b831-c7da1c6fdcfe","Type":"ContainerDied","Data":"2fb3b167d0f6cb0ac5a25ef6480bac2f94517f15a5d7fe7e3dc21ed3f6dbed92"}
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.679494 4749 scope.go:117] "RemoveContainer" containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f"
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.725903 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2"
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.744890 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-inventory\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.744950 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ovn-combined-ca-bundle\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745005 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-ovn-default-certs-0\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745129 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-bootstrap-combined-ca-bundle\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ssh-key-openstack-edpm-ipam\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lmrt\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-kube-api-access-6lmrt\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745271 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-libvirt-combined-ca-bundle\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745329 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-nova-combined-ca-bundle\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745357 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-repo-setup-combined-ca-bundle\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745398 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-telemetry-combined-ca-bundle\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.745434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-neutron-metadata-combined-ca-bundle\") pod \"486b7134-04b2-4255-b831-c7da1c6fdcfe\" (UID: \"486b7134-04b2-4255-b831-c7da1c6fdcfe\") "
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.754160 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-kube-api-access-6lmrt" (OuterVolumeSpecName: "kube-api-access-6lmrt") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "kube-api-access-6lmrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.754250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.754767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.755172 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.755560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.756178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.757493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.758228 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.759062 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe").
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.760252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.773722 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.775927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.793463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-inventory" (OuterVolumeSpecName: "inventory") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.801848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "486b7134-04b2-4255-b831-c7da1c6fdcfe" (UID: "486b7134-04b2-4255-b831-c7da1c6fdcfe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848520 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848565 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848580 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848593 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848608 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848620 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848632 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848645 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848658 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848669 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848682 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lmrt\" (UniqueName: \"kubernetes.io/projected/486b7134-04b2-4255-b831-c7da1c6fdcfe-kube-api-access-6lmrt\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848693 4749 reconciler_common.go:293] "Volume detached for 
volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848765 4749 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:34 crc kubenswrapper[4749]: I0219 19:06:34.848779 4749 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486b7134-04b2-4255-b831-c7da1c6fdcfe-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.321223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"c5ad61d4b4012dbeef208de5f27d39e7f654c70f2c2ac0549c67e3ec110793ae"} Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.322974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" event={"ID":"486b7134-04b2-4255-b831-c7da1c6fdcfe","Type":"ContainerDied","Data":"d717abefb7075228e451ff92f8569360c4f89dd3ef7460e15d69a89040ccccfa"} Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.323048 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d717abefb7075228e451ff92f8569360c4f89dd3ef7460e15d69a89040ccccfa" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.323069 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446097 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9"] Feb 19 19:06:35 crc kubenswrapper[4749]: E0219 19:06:35.446545 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="registry-server" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446566 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="registry-server" Feb 19 19:06:35 crc kubenswrapper[4749]: E0219 19:06:35.446587 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486b7134-04b2-4255-b831-c7da1c6fdcfe" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446595 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="486b7134-04b2-4255-b831-c7da1c6fdcfe" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:06:35 crc kubenswrapper[4749]: E0219 19:06:35.446615 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="extract-utilities" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446621 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="extract-utilities" Feb 19 19:06:35 crc kubenswrapper[4749]: E0219 19:06:35.446631 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="registry-server" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446637 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="registry-server" Feb 19 19:06:35 crc kubenswrapper[4749]: E0219 19:06:35.446670 
4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="extract-content" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446677 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="extract-content" Feb 19 19:06:35 crc kubenswrapper[4749]: E0219 19:06:35.446695 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="extract-utilities" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446702 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="extract-utilities" Feb 19 19:06:35 crc kubenswrapper[4749]: E0219 19:06:35.446714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="extract-content" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446720 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="extract-content" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446897 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="486b7134-04b2-4255-b831-c7da1c6fdcfe" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446916 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="356a254a-9963-46fb-8c4e-97414e163c64" containerName="registry-server" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.446930 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b5905-f54d-4d1a-829b-9009840ab909" containerName="registry-server" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.447690 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.451171 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.454245 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.454646 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.454903 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.455129 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.460847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.460963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.460984 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhn2\" (UniqueName: \"kubernetes.io/projected/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-kube-api-access-qmhn2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.461011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.461057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.465788 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9"] Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.562491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.562533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qmhn2\" (UniqueName: \"kubernetes.io/projected/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-kube-api-access-qmhn2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.562565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.562595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.562645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.563446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.567532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.568955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.569164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.582398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmhn2\" (UniqueName: \"kubernetes.io/projected/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-kube-api-access-qmhn2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qtct9\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:35 crc kubenswrapper[4749]: I0219 19:06:35.775930 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:06:36 crc kubenswrapper[4749]: I0219 19:06:36.306990 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9"] Feb 19 19:06:36 crc kubenswrapper[4749]: W0219 19:06:36.311486 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c86e6e6_0776_48c9_9c58_e1b2d41a4552.slice/crio-7f4e9f1822b68301a977ddcbc2df98ab0cda2c4770b72eebc0f59de0210a7988 WatchSource:0}: Error finding container 7f4e9f1822b68301a977ddcbc2df98ab0cda2c4770b72eebc0f59de0210a7988: Status 404 returned error can't find the container with id 7f4e9f1822b68301a977ddcbc2df98ab0cda2c4770b72eebc0f59de0210a7988 Feb 19 19:06:36 crc kubenswrapper[4749]: I0219 19:06:36.346093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" event={"ID":"8c86e6e6-0776-48c9-9c58-e1b2d41a4552","Type":"ContainerStarted","Data":"7f4e9f1822b68301a977ddcbc2df98ab0cda2c4770b72eebc0f59de0210a7988"} Feb 19 19:06:37 crc kubenswrapper[4749]: I0219 19:06:37.357332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" event={"ID":"8c86e6e6-0776-48c9-9c58-e1b2d41a4552","Type":"ContainerStarted","Data":"ca43e1a872d01030922c4693a10f81dd5cdede98be1e17d21ce0c62332afecc6"} Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.086527 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" podStartSLOduration=34.548174634 podStartE2EDuration="35.086501882s" podCreationTimestamp="2026-02-19 19:06:35 +0000 UTC" firstStartedPulling="2026-02-19 19:06:36.314086154 +0000 UTC m=+1970.275306128" lastFinishedPulling="2026-02-19 19:06:36.852413422 +0000 UTC m=+1970.813633376" observedRunningTime="2026-02-19 
19:06:37.380433252 +0000 UTC m=+1971.341653216" watchObservedRunningTime="2026-02-19 19:07:10.086501882 +0000 UTC m=+2004.047721856" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.096196 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qzjj4"] Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.098748 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.125478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzjj4"] Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.289301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmmq\" (UniqueName: \"kubernetes.io/projected/29084b55-cf30-4d08-89fd-16c6ffb4a44c-kube-api-access-grmmq\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.290134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-utilities\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.290168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-catalog-content\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.391631 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-utilities\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.391934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-catalog-content\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.392086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grmmq\" (UniqueName: \"kubernetes.io/projected/29084b55-cf30-4d08-89fd-16c6ffb4a44c-kube-api-access-grmmq\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.392093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-utilities\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.392307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-catalog-content\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.413389 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-grmmq\" (UniqueName: \"kubernetes.io/projected/29084b55-cf30-4d08-89fd-16c6ffb4a44c-kube-api-access-grmmq\") pod \"redhat-marketplace-qzjj4\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.426052 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:10 crc kubenswrapper[4749]: I0219 19:07:10.899657 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzjj4"] Feb 19 19:07:10 crc kubenswrapper[4749]: W0219 19:07:10.935397 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29084b55_cf30_4d08_89fd_16c6ffb4a44c.slice/crio-4e145bc23f0e5d75f179fdab3b0515877837033c3da0f5b5e8b4bbb5ee5e1dd8 WatchSource:0}: Error finding container 4e145bc23f0e5d75f179fdab3b0515877837033c3da0f5b5e8b4bbb5ee5e1dd8: Status 404 returned error can't find the container with id 4e145bc23f0e5d75f179fdab3b0515877837033c3da0f5b5e8b4bbb5ee5e1dd8 Feb 19 19:07:11 crc kubenswrapper[4749]: I0219 19:07:11.665423 4749 generic.go:334] "Generic (PLEG): container finished" podID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerID="9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252" exitCode=0 Feb 19 19:07:11 crc kubenswrapper[4749]: I0219 19:07:11.665473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzjj4" event={"ID":"29084b55-cf30-4d08-89fd-16c6ffb4a44c","Type":"ContainerDied","Data":"9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252"} Feb 19 19:07:11 crc kubenswrapper[4749]: I0219 19:07:11.665758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzjj4" 
event={"ID":"29084b55-cf30-4d08-89fd-16c6ffb4a44c","Type":"ContainerStarted","Data":"4e145bc23f0e5d75f179fdab3b0515877837033c3da0f5b5e8b4bbb5ee5e1dd8"} Feb 19 19:07:12 crc kubenswrapper[4749]: I0219 19:07:12.674623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzjj4" event={"ID":"29084b55-cf30-4d08-89fd-16c6ffb4a44c","Type":"ContainerStarted","Data":"6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459"} Feb 19 19:07:13 crc kubenswrapper[4749]: I0219 19:07:13.686865 4749 generic.go:334] "Generic (PLEG): container finished" podID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerID="6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459" exitCode=0 Feb 19 19:07:13 crc kubenswrapper[4749]: I0219 19:07:13.686929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzjj4" event={"ID":"29084b55-cf30-4d08-89fd-16c6ffb4a44c","Type":"ContainerDied","Data":"6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459"} Feb 19 19:07:14 crc kubenswrapper[4749]: I0219 19:07:14.698538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzjj4" event={"ID":"29084b55-cf30-4d08-89fd-16c6ffb4a44c","Type":"ContainerStarted","Data":"69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a"} Feb 19 19:07:14 crc kubenswrapper[4749]: I0219 19:07:14.723672 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qzjj4" podStartSLOduration=2.242337719 podStartE2EDuration="4.723652306s" podCreationTimestamp="2026-02-19 19:07:10 +0000 UTC" firstStartedPulling="2026-02-19 19:07:11.669298241 +0000 UTC m=+2005.630518195" lastFinishedPulling="2026-02-19 19:07:14.150612818 +0000 UTC m=+2008.111832782" observedRunningTime="2026-02-19 19:07:14.718642115 +0000 UTC m=+2008.679862069" watchObservedRunningTime="2026-02-19 19:07:14.723652306 +0000 UTC 
m=+2008.684872260" Feb 19 19:07:20 crc kubenswrapper[4749]: I0219 19:07:20.426450 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:20 crc kubenswrapper[4749]: I0219 19:07:20.427081 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:20 crc kubenswrapper[4749]: I0219 19:07:20.516244 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:20 crc kubenswrapper[4749]: I0219 19:07:20.850172 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:20 crc kubenswrapper[4749]: I0219 19:07:20.904158 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzjj4"] Feb 19 19:07:22 crc kubenswrapper[4749]: I0219 19:07:22.795810 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qzjj4" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="registry-server" containerID="cri-o://69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a" gracePeriod=2 Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.773962 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.820108 4749 generic.go:334] "Generic (PLEG): container finished" podID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerID="69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a" exitCode=0 Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.820164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzjj4" event={"ID":"29084b55-cf30-4d08-89fd-16c6ffb4a44c","Type":"ContainerDied","Data":"69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a"} Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.820194 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzjj4" event={"ID":"29084b55-cf30-4d08-89fd-16c6ffb4a44c","Type":"ContainerDied","Data":"4e145bc23f0e5d75f179fdab3b0515877837033c3da0f5b5e8b4bbb5ee5e1dd8"} Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.820215 4749 scope.go:117] "RemoveContainer" containerID="69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.820302 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzjj4" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.840155 4749 scope.go:117] "RemoveContainer" containerID="6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.859013 4749 scope.go:117] "RemoveContainer" containerID="9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.890859 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-catalog-content\") pod \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.890938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-utilities\") pod \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.891038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grmmq\" (UniqueName: \"kubernetes.io/projected/29084b55-cf30-4d08-89fd-16c6ffb4a44c-kube-api-access-grmmq\") pod \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\" (UID: \"29084b55-cf30-4d08-89fd-16c6ffb4a44c\") " Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.892735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-utilities" (OuterVolumeSpecName: "utilities") pod "29084b55-cf30-4d08-89fd-16c6ffb4a44c" (UID: "29084b55-cf30-4d08-89fd-16c6ffb4a44c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.898853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29084b55-cf30-4d08-89fd-16c6ffb4a44c-kube-api-access-grmmq" (OuterVolumeSpecName: "kube-api-access-grmmq") pod "29084b55-cf30-4d08-89fd-16c6ffb4a44c" (UID: "29084b55-cf30-4d08-89fd-16c6ffb4a44c"). InnerVolumeSpecName "kube-api-access-grmmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.905456 4749 scope.go:117] "RemoveContainer" containerID="69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a" Feb 19 19:07:23 crc kubenswrapper[4749]: E0219 19:07:23.906637 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a\": container with ID starting with 69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a not found: ID does not exist" containerID="69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.906681 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a"} err="failed to get container status \"69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a\": rpc error: code = NotFound desc = could not find container \"69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a\": container with ID starting with 69edc137779b9a25d26338c297ce40dc7babef7eb5c1c4e5bbed1760f9686f8a not found: ID does not exist" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.906701 4749 scope.go:117] "RemoveContainer" containerID="6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459" Feb 19 19:07:23 crc kubenswrapper[4749]: E0219 19:07:23.907023 
4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459\": container with ID starting with 6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459 not found: ID does not exist" containerID="6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.907062 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459"} err="failed to get container status \"6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459\": rpc error: code = NotFound desc = could not find container \"6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459\": container with ID starting with 6a46d5dec5d05c831b92ac6f3874ebdd88a9e83f7c7d5ade84d588ce276ef459 not found: ID does not exist" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.907078 4749 scope.go:117] "RemoveContainer" containerID="9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252" Feb 19 19:07:23 crc kubenswrapper[4749]: E0219 19:07:23.907641 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252\": container with ID starting with 9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252 not found: ID does not exist" containerID="9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.907685 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252"} err="failed to get container status \"9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252\": rpc error: code = 
NotFound desc = could not find container \"9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252\": container with ID starting with 9208780b1b0f9c986d007fbff618a50654e46ec772c585e4f34b4cdcd3148252 not found: ID does not exist" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.917429 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29084b55-cf30-4d08-89fd-16c6ffb4a44c" (UID: "29084b55-cf30-4d08-89fd-16c6ffb4a44c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.993697 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.993743 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29084b55-cf30-4d08-89fd-16c6ffb4a44c-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:23 crc kubenswrapper[4749]: I0219 19:07:23.993756 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grmmq\" (UniqueName: \"kubernetes.io/projected/29084b55-cf30-4d08-89fd-16c6ffb4a44c-kube-api-access-grmmq\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:24 crc kubenswrapper[4749]: I0219 19:07:24.159412 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzjj4"] Feb 19 19:07:24 crc kubenswrapper[4749]: I0219 19:07:24.169342 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzjj4"] Feb 19 19:07:24 crc kubenswrapper[4749]: I0219 19:07:24.721056 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" path="/var/lib/kubelet/pods/29084b55-cf30-4d08-89fd-16c6ffb4a44c/volumes" Feb 19 19:07:40 crc kubenswrapper[4749]: I0219 19:07:40.978493 4749 generic.go:334] "Generic (PLEG): container finished" podID="8c86e6e6-0776-48c9-9c58-e1b2d41a4552" containerID="ca43e1a872d01030922c4693a10f81dd5cdede98be1e17d21ce0c62332afecc6" exitCode=0 Feb 19 19:07:40 crc kubenswrapper[4749]: I0219 19:07:40.978600 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" event={"ID":"8c86e6e6-0776-48c9-9c58-e1b2d41a4552","Type":"ContainerDied","Data":"ca43e1a872d01030922c4693a10f81dd5cdede98be1e17d21ce0c62332afecc6"} Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.437201 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.592857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovn-combined-ca-bundle\") pod \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.592982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ssh-key-openstack-edpm-ipam\") pod \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.593045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-inventory\") pod \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " 
Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.593161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovncontroller-config-0\") pod \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.593240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhn2\" (UniqueName: \"kubernetes.io/projected/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-kube-api-access-qmhn2\") pod \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\" (UID: \"8c86e6e6-0776-48c9-9c58-e1b2d41a4552\") " Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.599414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-kube-api-access-qmhn2" (OuterVolumeSpecName: "kube-api-access-qmhn2") pod "8c86e6e6-0776-48c9-9c58-e1b2d41a4552" (UID: "8c86e6e6-0776-48c9-9c58-e1b2d41a4552"). InnerVolumeSpecName "kube-api-access-qmhn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.600281 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8c86e6e6-0776-48c9-9c58-e1b2d41a4552" (UID: "8c86e6e6-0776-48c9-9c58-e1b2d41a4552"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.625458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8c86e6e6-0776-48c9-9c58-e1b2d41a4552" (UID: "8c86e6e6-0776-48c9-9c58-e1b2d41a4552"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.642884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-inventory" (OuterVolumeSpecName: "inventory") pod "8c86e6e6-0776-48c9-9c58-e1b2d41a4552" (UID: "8c86e6e6-0776-48c9-9c58-e1b2d41a4552"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.644159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c86e6e6-0776-48c9-9c58-e1b2d41a4552" (UID: "8c86e6e6-0776-48c9-9c58-e1b2d41a4552"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.702597 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhn2\" (UniqueName: \"kubernetes.io/projected/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-kube-api-access-qmhn2\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.702656 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.702679 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.702692 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:42 crc kubenswrapper[4749]: I0219 19:07:42.702728 4749 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8c86e6e6-0776-48c9-9c58-e1b2d41a4552-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.005712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" event={"ID":"8c86e6e6-0776-48c9-9c58-e1b2d41a4552","Type":"ContainerDied","Data":"7f4e9f1822b68301a977ddcbc2df98ab0cda2c4770b72eebc0f59de0210a7988"} Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.006087 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4e9f1822b68301a977ddcbc2df98ab0cda2c4770b72eebc0f59de0210a7988" Feb 19 
19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.005923 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qtct9" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.166704 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27"] Feb 19 19:07:43 crc kubenswrapper[4749]: E0219 19:07:43.167192 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="registry-server" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.167209 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="registry-server" Feb 19 19:07:43 crc kubenswrapper[4749]: E0219 19:07:43.167239 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="extract-content" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.167245 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="extract-content" Feb 19 19:07:43 crc kubenswrapper[4749]: E0219 19:07:43.167254 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="extract-utilities" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.167260 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="extract-utilities" Feb 19 19:07:43 crc kubenswrapper[4749]: E0219 19:07:43.167271 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c86e6e6-0776-48c9-9c58-e1b2d41a4552" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.167277 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c86e6e6-0776-48c9-9c58-e1b2d41a4552" 
containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.167458 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c86e6e6-0776-48c9-9c58-e1b2d41a4552" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.167469 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="29084b55-cf30-4d08-89fd-16c6ffb4a44c" containerName="registry-server" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.168169 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.175240 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27"] Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.175474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.175496 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.175908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.175970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.176072 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.183270 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 19:07:43 crc 
kubenswrapper[4749]: I0219 19:07:43.315184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.315266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn4v9\" (UniqueName: \"kubernetes.io/projected/cdcbdafe-8bad-41be-91bb-59bc54994227-kube-api-access-tn4v9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.315349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.315376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.315436 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.315467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.416732 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn4v9\" (UniqueName: \"kubernetes.io/projected/cdcbdafe-8bad-41be-91bb-59bc54994227-kube-api-access-tn4v9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.416837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.416868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.416929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.416961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.416980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.421389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.421508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.421943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.427627 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.433824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.434604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn4v9\" (UniqueName: \"kubernetes.io/projected/cdcbdafe-8bad-41be-91bb-59bc54994227-kube-api-access-tn4v9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:43 crc kubenswrapper[4749]: I0219 19:07:43.494146 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:07:44 crc kubenswrapper[4749]: I0219 19:07:44.030113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27"] Feb 19 19:07:45 crc kubenswrapper[4749]: I0219 19:07:45.048671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" event={"ID":"cdcbdafe-8bad-41be-91bb-59bc54994227","Type":"ContainerStarted","Data":"24c1403f0df3555a45cae1667828a4faee92a3774cf0870944106427fb359660"} Feb 19 19:07:45 crc kubenswrapper[4749]: I0219 19:07:45.049119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" event={"ID":"cdcbdafe-8bad-41be-91bb-59bc54994227","Type":"ContainerStarted","Data":"666738555d447f75e32ca2f0f0ea12a4d7797da1cb21bbe444d4dba912bf2dc1"} Feb 19 19:07:45 crc kubenswrapper[4749]: I0219 19:07:45.082594 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" podStartSLOduration=1.513168152 
podStartE2EDuration="2.082560041s" podCreationTimestamp="2026-02-19 19:07:43 +0000 UTC" firstStartedPulling="2026-02-19 19:07:44.04199299 +0000 UTC m=+2038.003212944" lastFinishedPulling="2026-02-19 19:07:44.611384859 +0000 UTC m=+2038.572604833" observedRunningTime="2026-02-19 19:07:45.074101237 +0000 UTC m=+2039.035321221" watchObservedRunningTime="2026-02-19 19:07:45.082560041 +0000 UTC m=+2039.043780025" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.501934 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86qdr"] Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.505414 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.525534 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86qdr"] Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.656473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-catalog-content\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.656748 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-utilities\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.656930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvbz6\" (UniqueName: 
\"kubernetes.io/projected/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-kube-api-access-bvbz6\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.759260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-utilities\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.759339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvbz6\" (UniqueName: \"kubernetes.io/projected/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-kube-api-access-bvbz6\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.759443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-catalog-content\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.759894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-utilities\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.760204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-catalog-content\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.782767 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvbz6\" (UniqueName: \"kubernetes.io/projected/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-kube-api-access-bvbz6\") pod \"community-operators-86qdr\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:03 crc kubenswrapper[4749]: I0219 19:08:03.847923 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:04 crc kubenswrapper[4749]: I0219 19:08:04.350897 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86qdr"] Feb 19 19:08:05 crc kubenswrapper[4749]: I0219 19:08:05.248122 4749 generic.go:334] "Generic (PLEG): container finished" podID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerID="698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5" exitCode=0 Feb 19 19:08:05 crc kubenswrapper[4749]: I0219 19:08:05.248389 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86qdr" event={"ID":"984f5a1e-6057-49c2-aa79-19b6e2e8e51d","Type":"ContainerDied","Data":"698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5"} Feb 19 19:08:05 crc kubenswrapper[4749]: I0219 19:08:05.248414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86qdr" event={"ID":"984f5a1e-6057-49c2-aa79-19b6e2e8e51d","Type":"ContainerStarted","Data":"f24cd3a868538b592764da7ad227051b43adfe239a40441f86b8c07d7399d9d8"} Feb 19 19:08:06 crc kubenswrapper[4749]: I0219 19:08:06.258868 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-86qdr" event={"ID":"984f5a1e-6057-49c2-aa79-19b6e2e8e51d","Type":"ContainerStarted","Data":"65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae"} Feb 19 19:08:08 crc kubenswrapper[4749]: I0219 19:08:08.278061 4749 generic.go:334] "Generic (PLEG): container finished" podID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerID="65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae" exitCode=0 Feb 19 19:08:08 crc kubenswrapper[4749]: I0219 19:08:08.278149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86qdr" event={"ID":"984f5a1e-6057-49c2-aa79-19b6e2e8e51d","Type":"ContainerDied","Data":"65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae"} Feb 19 19:08:09 crc kubenswrapper[4749]: I0219 19:08:09.289497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86qdr" event={"ID":"984f5a1e-6057-49c2-aa79-19b6e2e8e51d","Type":"ContainerStarted","Data":"bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8"} Feb 19 19:08:09 crc kubenswrapper[4749]: I0219 19:08:09.316656 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86qdr" podStartSLOduration=2.909075681 podStartE2EDuration="6.316638539s" podCreationTimestamp="2026-02-19 19:08:03 +0000 UTC" firstStartedPulling="2026-02-19 19:08:05.251061425 +0000 UTC m=+2059.212281379" lastFinishedPulling="2026-02-19 19:08:08.658624283 +0000 UTC m=+2062.619844237" observedRunningTime="2026-02-19 19:08:09.312302385 +0000 UTC m=+2063.273522339" watchObservedRunningTime="2026-02-19 19:08:09.316638539 +0000 UTC m=+2063.277858483" Feb 19 19:08:13 crc kubenswrapper[4749]: I0219 19:08:13.848284 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:13 crc kubenswrapper[4749]: I0219 19:08:13.848825 
4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:13 crc kubenswrapper[4749]: I0219 19:08:13.891955 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:14 crc kubenswrapper[4749]: I0219 19:08:14.375836 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:14 crc kubenswrapper[4749]: I0219 19:08:14.419638 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86qdr"] Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.351452 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-86qdr" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="registry-server" containerID="cri-o://bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8" gracePeriod=2 Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.843674 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.931722 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-utilities\") pod \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.932280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-catalog-content\") pod \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.932418 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvbz6\" (UniqueName: \"kubernetes.io/projected/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-kube-api-access-bvbz6\") pod \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\" (UID: \"984f5a1e-6057-49c2-aa79-19b6e2e8e51d\") " Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.932901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-utilities" (OuterVolumeSpecName: "utilities") pod "984f5a1e-6057-49c2-aa79-19b6e2e8e51d" (UID: "984f5a1e-6057-49c2-aa79-19b6e2e8e51d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.933239 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.938810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-kube-api-access-bvbz6" (OuterVolumeSpecName: "kube-api-access-bvbz6") pod "984f5a1e-6057-49c2-aa79-19b6e2e8e51d" (UID: "984f5a1e-6057-49c2-aa79-19b6e2e8e51d"). InnerVolumeSpecName "kube-api-access-bvbz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:08:16 crc kubenswrapper[4749]: I0219 19:08:16.994692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "984f5a1e-6057-49c2-aa79-19b6e2e8e51d" (UID: "984f5a1e-6057-49c2-aa79-19b6e2e8e51d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.035471 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.035517 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvbz6\" (UniqueName: \"kubernetes.io/projected/984f5a1e-6057-49c2-aa79-19b6e2e8e51d-kube-api-access-bvbz6\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.363129 4749 generic.go:334] "Generic (PLEG): container finished" podID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerID="bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8" exitCode=0 Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.363183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86qdr" event={"ID":"984f5a1e-6057-49c2-aa79-19b6e2e8e51d","Type":"ContainerDied","Data":"bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8"} Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.363204 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86qdr" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.363219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86qdr" event={"ID":"984f5a1e-6057-49c2-aa79-19b6e2e8e51d","Type":"ContainerDied","Data":"f24cd3a868538b592764da7ad227051b43adfe239a40441f86b8c07d7399d9d8"} Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.363245 4749 scope.go:117] "RemoveContainer" containerID="bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.385200 4749 scope.go:117] "RemoveContainer" containerID="65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.427476 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86qdr"] Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.442632 4749 scope.go:117] "RemoveContainer" containerID="698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.443425 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86qdr"] Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.464992 4749 scope.go:117] "RemoveContainer" containerID="bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8" Feb 19 19:08:17 crc kubenswrapper[4749]: E0219 19:08:17.465544 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8\": container with ID starting with bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8 not found: ID does not exist" containerID="bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.465679 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8"} err="failed to get container status \"bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8\": rpc error: code = NotFound desc = could not find container \"bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8\": container with ID starting with bb2678bc6f7d5dcceee7b201370efe2f5d207a1d0b1032f4610139bb436b33c8 not found: ID does not exist" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.465784 4749 scope.go:117] "RemoveContainer" containerID="65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae" Feb 19 19:08:17 crc kubenswrapper[4749]: E0219 19:08:17.466277 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae\": container with ID starting with 65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae not found: ID does not exist" containerID="65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.466421 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae"} err="failed to get container status \"65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae\": rpc error: code = NotFound desc = could not find container \"65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae\": container with ID starting with 65d3b408ec2a0942bf2d522324f108e4c659a58719cf2820fd79615c4ba7bfae not found: ID does not exist" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.466535 4749 scope.go:117] "RemoveContainer" containerID="698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5" Feb 19 19:08:17 crc kubenswrapper[4749]: E0219 
19:08:17.466900 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5\": container with ID starting with 698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5 not found: ID does not exist" containerID="698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5" Feb 19 19:08:17 crc kubenswrapper[4749]: I0219 19:08:17.466999 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5"} err="failed to get container status \"698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5\": rpc error: code = NotFound desc = could not find container \"698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5\": container with ID starting with 698ed67684f5a7f57043dcc7fae618cde859eb26ad20e91d69c5ddf0c4116ae5 not found: ID does not exist" Feb 19 19:08:18 crc kubenswrapper[4749]: I0219 19:08:18.698607 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" path="/var/lib/kubelet/pods/984f5a1e-6057-49c2-aa79-19b6e2e8e51d/volumes" Feb 19 19:08:34 crc kubenswrapper[4749]: I0219 19:08:34.519813 4749 generic.go:334] "Generic (PLEG): container finished" podID="cdcbdafe-8bad-41be-91bb-59bc54994227" containerID="24c1403f0df3555a45cae1667828a4faee92a3774cf0870944106427fb359660" exitCode=0 Feb 19 19:08:34 crc kubenswrapper[4749]: I0219 19:08:34.519921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" event={"ID":"cdcbdafe-8bad-41be-91bb-59bc54994227","Type":"ContainerDied","Data":"24c1403f0df3555a45cae1667828a4faee92a3774cf0870944106427fb359660"} Feb 19 19:08:35 crc kubenswrapper[4749]: I0219 19:08:35.949405 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.146627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-ssh-key-openstack-edpm-ipam\") pod \"cdcbdafe-8bad-41be-91bb-59bc54994227\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.146757 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cdcbdafe-8bad-41be-91bb-59bc54994227\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.146803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-inventory\") pod \"cdcbdafe-8bad-41be-91bb-59bc54994227\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.146841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-nova-metadata-neutron-config-0\") pod \"cdcbdafe-8bad-41be-91bb-59bc54994227\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.146875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-metadata-combined-ca-bundle\") pod \"cdcbdafe-8bad-41be-91bb-59bc54994227\" (UID: 
\"cdcbdafe-8bad-41be-91bb-59bc54994227\") " Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.146900 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn4v9\" (UniqueName: \"kubernetes.io/projected/cdcbdafe-8bad-41be-91bb-59bc54994227-kube-api-access-tn4v9\") pod \"cdcbdafe-8bad-41be-91bb-59bc54994227\" (UID: \"cdcbdafe-8bad-41be-91bb-59bc54994227\") " Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.158250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cdcbdafe-8bad-41be-91bb-59bc54994227" (UID: "cdcbdafe-8bad-41be-91bb-59bc54994227"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.158373 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdcbdafe-8bad-41be-91bb-59bc54994227-kube-api-access-tn4v9" (OuterVolumeSpecName: "kube-api-access-tn4v9") pod "cdcbdafe-8bad-41be-91bb-59bc54994227" (UID: "cdcbdafe-8bad-41be-91bb-59bc54994227"). InnerVolumeSpecName "kube-api-access-tn4v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.175674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cdcbdafe-8bad-41be-91bb-59bc54994227" (UID: "cdcbdafe-8bad-41be-91bb-59bc54994227"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.175753 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-inventory" (OuterVolumeSpecName: "inventory") pod "cdcbdafe-8bad-41be-91bb-59bc54994227" (UID: "cdcbdafe-8bad-41be-91bb-59bc54994227"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.178232 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cdcbdafe-8bad-41be-91bb-59bc54994227" (UID: "cdcbdafe-8bad-41be-91bb-59bc54994227"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.184049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cdcbdafe-8bad-41be-91bb-59bc54994227" (UID: "cdcbdafe-8bad-41be-91bb-59bc54994227"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.249209 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.249247 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.249268 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.249284 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.249298 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcbdafe-8bad-41be-91bb-59bc54994227-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.249312 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn4v9\" (UniqueName: \"kubernetes.io/projected/cdcbdafe-8bad-41be-91bb-59bc54994227-kube-api-access-tn4v9\") on node \"crc\" DevicePath \"\"" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.545002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" event={"ID":"cdcbdafe-8bad-41be-91bb-59bc54994227","Type":"ContainerDied","Data":"666738555d447f75e32ca2f0f0ea12a4d7797da1cb21bbe444d4dba912bf2dc1"} Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.545407 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="666738555d447f75e32ca2f0f0ea12a4d7797da1cb21bbe444d4dba912bf2dc1" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.545100 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.691157 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4"] Feb 19 19:08:36 crc kubenswrapper[4749]: E0219 19:08:36.691536 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="extract-utilities" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.691554 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="extract-utilities" Feb 19 19:08:36 crc kubenswrapper[4749]: E0219 19:08:36.691584 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="registry-server" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.691592 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="registry-server" Feb 19 19:08:36 crc kubenswrapper[4749]: E0219 19:08:36.691613 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="extract-content" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.691620 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="extract-content" Feb 19 19:08:36 crc kubenswrapper[4749]: E0219 19:08:36.691638 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcbdafe-8bad-41be-91bb-59bc54994227" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.691646 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcbdafe-8bad-41be-91bb-59bc54994227" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.691887 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="984f5a1e-6057-49c2-aa79-19b6e2e8e51d" containerName="registry-server" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.691914 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcbdafe-8bad-41be-91bb-59bc54994227" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.692766 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.695727 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.695916 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.695978 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.696065 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.696218 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.703825 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4"] Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.758703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.758751 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.758822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7tqb\" (UniqueName: \"kubernetes.io/projected/1032ad4c-247e-48e1-805c-31aadc54415d-kube-api-access-h7tqb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.758908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.758997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.860445 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7tqb\" (UniqueName: \"kubernetes.io/projected/1032ad4c-247e-48e1-805c-31aadc54415d-kube-api-access-h7tqb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.860535 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.860661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.860766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.860802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.866435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: 
\"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.866560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.866949 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.867647 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:36 crc kubenswrapper[4749]: I0219 19:08:36.888634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7tqb\" (UniqueName: \"kubernetes.io/projected/1032ad4c-247e-48e1-805c-31aadc54415d-kube-api-access-h7tqb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:37 crc kubenswrapper[4749]: I0219 19:08:37.022644 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:08:37 crc kubenswrapper[4749]: I0219 19:08:37.583225 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4"] Feb 19 19:08:38 crc kubenswrapper[4749]: I0219 19:08:38.562853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" event={"ID":"1032ad4c-247e-48e1-805c-31aadc54415d","Type":"ContainerStarted","Data":"672c27014926a224e69a7e034cd04363e78bb4a197cfa13865b5c85cb750d4e7"} Feb 19 19:08:38 crc kubenswrapper[4749]: I0219 19:08:38.564805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" event={"ID":"1032ad4c-247e-48e1-805c-31aadc54415d","Type":"ContainerStarted","Data":"76b60e22d22b076e4f744a04facaf9f6ce2eab6fb053a1e66ea27de237695427"} Feb 19 19:08:38 crc kubenswrapper[4749]: I0219 19:08:38.589843 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" podStartSLOduration=2.168323715 podStartE2EDuration="2.589817824s" podCreationTimestamp="2026-02-19 19:08:36 +0000 UTC" firstStartedPulling="2026-02-19 19:08:37.593529966 +0000 UTC m=+2091.554749920" lastFinishedPulling="2026-02-19 19:08:38.015024075 +0000 UTC m=+2091.976244029" observedRunningTime="2026-02-19 19:08:38.5780871 +0000 UTC m=+2092.539307074" watchObservedRunningTime="2026-02-19 19:08:38.589817824 +0000 UTC m=+2092.551037798" Feb 19 19:08:54 crc kubenswrapper[4749]: I0219 19:08:54.724970 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:08:54 crc kubenswrapper[4749]: I0219 
19:08:54.725511 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:09:24 crc kubenswrapper[4749]: I0219 19:09:24.725730 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:09:24 crc kubenswrapper[4749]: I0219 19:09:24.726314 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:09:54 crc kubenswrapper[4749]: I0219 19:09:54.725444 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:09:54 crc kubenswrapper[4749]: I0219 19:09:54.726195 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:09:54 crc kubenswrapper[4749]: I0219 19:09:54.726263 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 19:09:54 crc kubenswrapper[4749]: I0219 19:09:54.727410 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5ad61d4b4012dbeef208de5f27d39e7f654c70f2c2ac0549c67e3ec110793ae"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:09:54 crc kubenswrapper[4749]: I0219 19:09:54.727508 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://c5ad61d4b4012dbeef208de5f27d39e7f654c70f2c2ac0549c67e3ec110793ae" gracePeriod=600 Feb 19 19:09:55 crc kubenswrapper[4749]: I0219 19:09:55.324300 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="c5ad61d4b4012dbeef208de5f27d39e7f654c70f2c2ac0549c67e3ec110793ae" exitCode=0 Feb 19 19:09:55 crc kubenswrapper[4749]: I0219 19:09:55.324478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"c5ad61d4b4012dbeef208de5f27d39e7f654c70f2c2ac0549c67e3ec110793ae"} Feb 19 19:09:55 crc kubenswrapper[4749]: I0219 19:09:55.324528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"} Feb 19 19:09:55 crc kubenswrapper[4749]: I0219 19:09:55.324547 4749 scope.go:117] "RemoveContainer" 
containerID="45ae2dc65ae4cc63b8473879837354344505b334c75e25fc391dff3af3c1ed1f" Feb 19 19:12:24 crc kubenswrapper[4749]: I0219 19:12:24.725643 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:12:24 crc kubenswrapper[4749]: I0219 19:12:24.726445 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:12:31 crc kubenswrapper[4749]: I0219 19:12:31.887613 4749 generic.go:334] "Generic (PLEG): container finished" podID="1032ad4c-247e-48e1-805c-31aadc54415d" containerID="672c27014926a224e69a7e034cd04363e78bb4a197cfa13865b5c85cb750d4e7" exitCode=0 Feb 19 19:12:31 crc kubenswrapper[4749]: I0219 19:12:31.887698 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" event={"ID":"1032ad4c-247e-48e1-805c-31aadc54415d","Type":"ContainerDied","Data":"672c27014926a224e69a7e034cd04363e78bb4a197cfa13865b5c85cb750d4e7"} Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.364391 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.427203 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-secret-0\") pod \"1032ad4c-247e-48e1-805c-31aadc54415d\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.427513 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7tqb\" (UniqueName: \"kubernetes.io/projected/1032ad4c-247e-48e1-805c-31aadc54415d-kube-api-access-h7tqb\") pod \"1032ad4c-247e-48e1-805c-31aadc54415d\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.427602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-ssh-key-openstack-edpm-ipam\") pod \"1032ad4c-247e-48e1-805c-31aadc54415d\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.427683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-combined-ca-bundle\") pod \"1032ad4c-247e-48e1-805c-31aadc54415d\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.427721 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-inventory\") pod \"1032ad4c-247e-48e1-805c-31aadc54415d\" (UID: \"1032ad4c-247e-48e1-805c-31aadc54415d\") " Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.438202 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1032ad4c-247e-48e1-805c-31aadc54415d-kube-api-access-h7tqb" (OuterVolumeSpecName: "kube-api-access-h7tqb") pod "1032ad4c-247e-48e1-805c-31aadc54415d" (UID: "1032ad4c-247e-48e1-805c-31aadc54415d"). InnerVolumeSpecName "kube-api-access-h7tqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.452198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1032ad4c-247e-48e1-805c-31aadc54415d" (UID: "1032ad4c-247e-48e1-805c-31aadc54415d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.460657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "1032ad4c-247e-48e1-805c-31aadc54415d" (UID: "1032ad4c-247e-48e1-805c-31aadc54415d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.461227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-inventory" (OuterVolumeSpecName: "inventory") pod "1032ad4c-247e-48e1-805c-31aadc54415d" (UID: "1032ad4c-247e-48e1-805c-31aadc54415d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.472716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1032ad4c-247e-48e1-805c-31aadc54415d" (UID: "1032ad4c-247e-48e1-805c-31aadc54415d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.530544 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7tqb\" (UniqueName: \"kubernetes.io/projected/1032ad4c-247e-48e1-805c-31aadc54415d-kube-api-access-h7tqb\") on node \"crc\" DevicePath \"\"" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.530575 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.530586 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.530595 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.530605 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1032ad4c-247e-48e1-805c-31aadc54415d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.918222 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" event={"ID":"1032ad4c-247e-48e1-805c-31aadc54415d","Type":"ContainerDied","Data":"76b60e22d22b076e4f744a04facaf9f6ce2eab6fb053a1e66ea27de237695427"} Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.918258 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76b60e22d22b076e4f744a04facaf9f6ce2eab6fb053a1e66ea27de237695427" Feb 19 19:12:33 crc kubenswrapper[4749]: I0219 19:12:33.918259 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.083520 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj"] Feb 19 19:12:34 crc kubenswrapper[4749]: E0219 19:12:34.084294 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1032ad4c-247e-48e1-805c-31aadc54415d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.084312 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1032ad4c-247e-48e1-805c-31aadc54415d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.084505 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1032ad4c-247e-48e1-805c-31aadc54415d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.085230 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.087326 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.087950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.093961 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.094039 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.095122 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.095438 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.095914 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.096377 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj"] Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.150962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e120e358-960c-435a-9655-35499a01c0c0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 
19:12:34.151322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.151545 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.151706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.151769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75b27\" (UniqueName: \"kubernetes.io/projected/e120e358-960c-435a-9655-35499a01c0c0-kube-api-access-75b27\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.151858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" 
(UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.151933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.151965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.152018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.152279 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: 
\"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.152408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.253767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254421 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e120e358-960c-435a-9655-35499a01c0c0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254631 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75b27\" (UniqueName: \"kubernetes.io/projected/e120e358-960c-435a-9655-35499a01c0c0-kube-api-access-75b27\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.254948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.255055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.255159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.256614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e120e358-960c-435a-9655-35499a01c0c0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.261632 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.261647 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.261960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.263241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.263624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.264392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.264842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.266452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.266867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.273590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75b27\" (UniqueName: 
\"kubernetes.io/projected/e120e358-960c-435a-9655-35499a01c0c0-kube-api-access-75b27\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr2gj\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.411425 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.975291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj"] Feb 19 19:12:34 crc kubenswrapper[4749]: I0219 19:12:34.978414 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:12:35 crc kubenswrapper[4749]: I0219 19:12:35.935744 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" event={"ID":"e120e358-960c-435a-9655-35499a01c0c0","Type":"ContainerStarted","Data":"f4be0b10594f752d1a200b6e9843ed9ac34892d2455448ea491c653570fcd96c"} Feb 19 19:12:35 crc kubenswrapper[4749]: I0219 19:12:35.936282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" event={"ID":"e120e358-960c-435a-9655-35499a01c0c0","Type":"ContainerStarted","Data":"f9eca380104e2dab07c1cc16d2464907f4501836c59018d6a880e6643af13400"} Feb 19 19:12:35 crc kubenswrapper[4749]: I0219 19:12:35.979231 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" podStartSLOduration=1.5975549789999999 podStartE2EDuration="1.979204256s" podCreationTimestamp="2026-02-19 19:12:34 +0000 UTC" firstStartedPulling="2026-02-19 19:12:34.978183142 +0000 UTC m=+2328.939403096" lastFinishedPulling="2026-02-19 19:12:35.359832419 +0000 UTC m=+2329.321052373" observedRunningTime="2026-02-19 
19:12:35.969514692 +0000 UTC m=+2329.930734656" watchObservedRunningTime="2026-02-19 19:12:35.979204256 +0000 UTC m=+2329.940424240" Feb 19 19:12:54 crc kubenswrapper[4749]: I0219 19:12:54.725777 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:12:54 crc kubenswrapper[4749]: I0219 19:12:54.726471 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:13:24 crc kubenswrapper[4749]: I0219 19:13:24.725875 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:13:24 crc kubenswrapper[4749]: I0219 19:13:24.726458 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:13:24 crc kubenswrapper[4749]: I0219 19:13:24.726509 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 19:13:24 crc kubenswrapper[4749]: I0219 19:13:24.727184 4749 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:13:24 crc kubenswrapper[4749]: I0219 19:13:24.727241 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" gracePeriod=600 Feb 19 19:13:24 crc kubenswrapper[4749]: E0219 19:13:24.870827 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:13:25 crc kubenswrapper[4749]: I0219 19:13:25.396461 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" exitCode=0 Feb 19 19:13:25 crc kubenswrapper[4749]: I0219 19:13:25.396512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"} Feb 19 19:13:25 crc kubenswrapper[4749]: I0219 19:13:25.396553 4749 scope.go:117] "RemoveContainer" containerID="c5ad61d4b4012dbeef208de5f27d39e7f654c70f2c2ac0549c67e3ec110793ae" Feb 19 19:13:25 crc 
kubenswrapper[4749]: I0219 19:13:25.397365 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:13:25 crc kubenswrapper[4749]: E0219 19:13:25.397771 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:13:38 crc kubenswrapper[4749]: I0219 19:13:38.679591 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:13:38 crc kubenswrapper[4749]: E0219 19:13:38.680502 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:13:51 crc kubenswrapper[4749]: I0219 19:13:51.679232 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:13:51 crc kubenswrapper[4749]: E0219 19:13:51.680137 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 
19 19:14:04 crc kubenswrapper[4749]: I0219 19:14:04.679114 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:14:04 crc kubenswrapper[4749]: E0219 19:14:04.680064 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:14:19 crc kubenswrapper[4749]: I0219 19:14:19.679088 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:14:19 crc kubenswrapper[4749]: E0219 19:14:19.679891 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:14:34 crc kubenswrapper[4749]: I0219 19:14:34.680202 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:14:34 crc kubenswrapper[4749]: E0219 19:14:34.682816 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" 
podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:14:47 crc kubenswrapper[4749]: I0219 19:14:47.679630 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:14:47 crc kubenswrapper[4749]: E0219 19:14:47.681091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:14:58 crc kubenswrapper[4749]: I0219 19:14:58.217802 4749 generic.go:334] "Generic (PLEG): container finished" podID="e120e358-960c-435a-9655-35499a01c0c0" containerID="f4be0b10594f752d1a200b6e9843ed9ac34892d2455448ea491c653570fcd96c" exitCode=0 Feb 19 19:14:58 crc kubenswrapper[4749]: I0219 19:14:58.217910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" event={"ID":"e120e358-960c-435a-9655-35499a01c0c0","Type":"ContainerDied","Data":"f4be0b10594f752d1a200b6e9843ed9ac34892d2455448ea491c653570fcd96c"} Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.672659 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.761588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-inventory\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.761918 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-1\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.762012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-1\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.762237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-2\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.762652 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75b27\" (UniqueName: \"kubernetes.io/projected/e120e358-960c-435a-9655-35499a01c0c0-kube-api-access-75b27\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.762753 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-ssh-key-openstack-edpm-ipam\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.762860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e120e358-960c-435a-9655-35499a01c0c0-nova-extra-config-0\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.762971 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-0\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.763087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-combined-ca-bundle\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.763470 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-0\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.763614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-3\") pod \"e120e358-960c-435a-9655-35499a01c0c0\" (UID: \"e120e358-960c-435a-9655-35499a01c0c0\") " Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.779213 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.790314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e120e358-960c-435a-9655-35499a01c0c0-kube-api-access-75b27" (OuterVolumeSpecName: "kube-api-access-75b27") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "kube-api-access-75b27". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.801665 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-inventory" (OuterVolumeSpecName: "inventory") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.802307 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.803576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e120e358-960c-435a-9655-35499a01c0c0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.804652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.806998 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.809186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.814075 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.820591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.829312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e120e358-960c-435a-9655-35499a01c0c0" (UID: "e120e358-960c-435a-9655-35499a01c0c0"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.865713 4749 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.865967 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866107 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866171 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866239 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866294 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866348 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866409 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75b27\" (UniqueName: \"kubernetes.io/projected/e120e358-960c-435a-9655-35499a01c0c0-kube-api-access-75b27\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866470 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866525 4749 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e120e358-960c-435a-9655-35499a01c0c0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:14:59 crc kubenswrapper[4749]: I0219 19:14:59.866582 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e120e358-960c-435a-9655-35499a01c0c0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.146063 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"] Feb 19 19:15:00 crc kubenswrapper[4749]: E0219 19:15:00.146643 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e120e358-960c-435a-9655-35499a01c0c0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.146668 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e120e358-960c-435a-9655-35499a01c0c0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.146874 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e120e358-960c-435a-9655-35499a01c0c0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.148017 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.152718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.157089 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"] Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.157404 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.244824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" event={"ID":"e120e358-960c-435a-9655-35499a01c0c0","Type":"ContainerDied","Data":"f9eca380104e2dab07c1cc16d2464907f4501836c59018d6a880e6643af13400"} Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.244875 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9eca380104e2dab07c1cc16d2464907f4501836c59018d6a880e6643af13400" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.244875 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr2gj" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.279894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-secret-volume\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.279953 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-config-volume\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.280089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59s2d\" (UniqueName: \"kubernetes.io/projected/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-kube-api-access-59s2d\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.339992 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"] Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.341562 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.347298 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.347608 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.347890 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.348106 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4pz98" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.348261 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.349761 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"] Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.381275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-secret-volume\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.381330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-config-volume\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.381459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59s2d\" (UniqueName: \"kubernetes.io/projected/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-kube-api-access-59s2d\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.382558 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-config-volume\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.385517 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-secret-volume\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.398723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59s2d\" (UniqueName: \"kubernetes.io/projected/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-kube-api-access-59s2d\") pod \"collect-profiles-29525475-jbrdb\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:00 crc kubenswrapper[4749]: E0219 19:15:00.408429 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode120e358_960c_435a_9655_35499a01c0c0.slice\": RecentStats: unable to find data in memory cache]"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.478733 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.483209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv2d2\" (UniqueName: \"kubernetes.io/projected/39f74bf8-e240-408d-a674-c61fcf66fd06-kube-api-access-xv2d2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.483285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.483305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.483525 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.483701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.483794 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.483972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.586953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv2d2\" (UniqueName: \"kubernetes.io/projected/39f74bf8-e240-408d-a674-c61fcf66fd06-kube-api-access-xv2d2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.587270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.587294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.587321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.587348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.587370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.587421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.593539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.593828 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.594245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.594997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.595018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.597269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.609997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv2d2\" (UniqueName: \"kubernetes.io/projected/39f74bf8-e240-408d-a674-c61fcf66fd06-kube-api-access-xv2d2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tsflr\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.658211 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:15:00 crc kubenswrapper[4749]: I0219 19:15:00.937889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"]
Feb 19 19:15:01 crc kubenswrapper[4749]: W0219 19:15:01.161121 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39f74bf8_e240_408d_a674_c61fcf66fd06.slice/crio-35958c58c5cdaeb0184aed6e1274503167a45e0451d9ceed4f26d492f4c3c7d4 WatchSource:0}: Error finding container 35958c58c5cdaeb0184aed6e1274503167a45e0451d9ceed4f26d492f4c3c7d4: Status 404 returned error can't find the container with id 35958c58c5cdaeb0184aed6e1274503167a45e0451d9ceed4f26d492f4c3c7d4
Feb 19 19:15:01 crc kubenswrapper[4749]: I0219 19:15:01.162683 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"]
Feb 19 19:15:01 crc kubenswrapper[4749]: I0219 19:15:01.254937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" event={"ID":"9a7ee6e5-4a4d-49d3-83f1-59192e92daba","Type":"ContainerStarted","Data":"f0bf23d1584e1a3ec050cc5c328a9e55559dd4f6d25aa109e51fc657d81f3484"}
Feb 19 19:15:01 crc kubenswrapper[4749]: I0219 19:15:01.254980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" event={"ID":"9a7ee6e5-4a4d-49d3-83f1-59192e92daba","Type":"ContainerStarted","Data":"c52af85c72a82aba614095cae81777764d1ec1ad5b0ab809efd81d625c14faff"}
Feb 19 19:15:01 crc kubenswrapper[4749]: I0219 19:15:01.258898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr" event={"ID":"39f74bf8-e240-408d-a674-c61fcf66fd06","Type":"ContainerStarted","Data":"35958c58c5cdaeb0184aed6e1274503167a45e0451d9ceed4f26d492f4c3c7d4"}
Feb 19 19:15:01 crc kubenswrapper[4749]: I0219 19:15:01.281901 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" podStartSLOduration=1.281879298 podStartE2EDuration="1.281879298s" podCreationTimestamp="2026-02-19 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:15:01.269535311 +0000 UTC m=+2475.230755265" watchObservedRunningTime="2026-02-19 19:15:01.281879298 +0000 UTC m=+2475.243099252"
Feb 19 19:15:01 crc kubenswrapper[4749]: I0219 19:15:01.679315 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:15:01 crc kubenswrapper[4749]: E0219 19:15:01.679674 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:15:02 crc kubenswrapper[4749]: I0219 19:15:02.284536 4749 generic.go:334] "Generic (PLEG): container finished" podID="9a7ee6e5-4a4d-49d3-83f1-59192e92daba" containerID="f0bf23d1584e1a3ec050cc5c328a9e55559dd4f6d25aa109e51fc657d81f3484" exitCode=0
Feb 19 19:15:02 crc kubenswrapper[4749]: I0219 19:15:02.284893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" event={"ID":"9a7ee6e5-4a4d-49d3-83f1-59192e92daba","Type":"ContainerDied","Data":"f0bf23d1584e1a3ec050cc5c328a9e55559dd4f6d25aa109e51fc657d81f3484"}
Feb 19 19:15:02 crc kubenswrapper[4749]: I0219 19:15:02.286685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr" event={"ID":"39f74bf8-e240-408d-a674-c61fcf66fd06","Type":"ContainerStarted","Data":"cb3f2b06f5bb0d06c284ad677b4c16fa69291ee3d1ecb6c64b3981c0992a0015"}
Feb 19 19:15:02 crc kubenswrapper[4749]: I0219 19:15:02.325960 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr" podStartSLOduration=1.774007514 podStartE2EDuration="2.325943556s" podCreationTimestamp="2026-02-19 19:15:00 +0000 UTC" firstStartedPulling="2026-02-19 19:15:01.164187315 +0000 UTC m=+2475.125407269" lastFinishedPulling="2026-02-19 19:15:01.716123357 +0000 UTC m=+2475.677343311" observedRunningTime="2026-02-19 19:15:02.325918026 +0000 UTC m=+2476.287138000" watchObservedRunningTime="2026-02-19 19:15:02.325943556 +0000 UTC m=+2476.287163510"
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.636786 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.758454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-secret-volume\") pod \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") "
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.758867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59s2d\" (UniqueName: \"kubernetes.io/projected/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-kube-api-access-59s2d\") pod \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") "
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.758916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-config-volume\") pod \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\" (UID: \"9a7ee6e5-4a4d-49d3-83f1-59192e92daba\") "
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.759753 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-config-volume" (OuterVolumeSpecName: "config-volume") pod "9a7ee6e5-4a4d-49d3-83f1-59192e92daba" (UID: "9a7ee6e5-4a4d-49d3-83f1-59192e92daba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.766229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-kube-api-access-59s2d" (OuterVolumeSpecName: "kube-api-access-59s2d") pod "9a7ee6e5-4a4d-49d3-83f1-59192e92daba" (UID: "9a7ee6e5-4a4d-49d3-83f1-59192e92daba"). InnerVolumeSpecName "kube-api-access-59s2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.768438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9a7ee6e5-4a4d-49d3-83f1-59192e92daba" (UID: "9a7ee6e5-4a4d-49d3-83f1-59192e92daba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.861129 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.861169 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59s2d\" (UniqueName: \"kubernetes.io/projected/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-kube-api-access-59s2d\") on node \"crc\" DevicePath \"\""
Feb 19 19:15:03 crc kubenswrapper[4749]: I0219 19:15:03.861180 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a7ee6e5-4a4d-49d3-83f1-59192e92daba-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 19:15:04 crc kubenswrapper[4749]: I0219 19:15:04.312993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb" event={"ID":"9a7ee6e5-4a4d-49d3-83f1-59192e92daba","Type":"ContainerDied","Data":"c52af85c72a82aba614095cae81777764d1ec1ad5b0ab809efd81d625c14faff"}
Feb 19 19:15:04 crc kubenswrapper[4749]: I0219 19:15:04.313046 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c52af85c72a82aba614095cae81777764d1ec1ad5b0ab809efd81d625c14faff"
Feb 19 19:15:04 crc kubenswrapper[4749]: I0219 19:15:04.313086 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"
Feb 19 19:15:04 crc kubenswrapper[4749]: I0219 19:15:04.346842 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"]
Feb 19 19:15:04 crc kubenswrapper[4749]: I0219 19:15:04.358106 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525430-b9kh6"]
Feb 19 19:15:04 crc kubenswrapper[4749]: I0219 19:15:04.692431 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e534c2-c769-4ad3-942b-d181ed2cf11e" path="/var/lib/kubelet/pods/c2e534c2-c769-4ad3-942b-d181ed2cf11e/volumes"
Feb 19 19:15:15 crc kubenswrapper[4749]: I0219 19:15:15.679844 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:15:15 crc kubenswrapper[4749]: E0219 19:15:15.681054 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:15:18 crc kubenswrapper[4749]: I0219 19:15:18.837800 4749 scope.go:117] "RemoveContainer" containerID="7b15c2b1a142dfdad1583f214541aacdcd68db7a0d7388d30b6da261a6fe8f8c"
Feb 19 19:15:26 crc kubenswrapper[4749]: I0219 19:15:26.686895 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:15:26 crc kubenswrapper[4749]: E0219 19:15:26.687766 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:15:37 crc kubenswrapper[4749]: I0219 19:15:37.679856 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:15:37 crc kubenswrapper[4749]: E0219 19:15:37.680712 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:15:51 crc kubenswrapper[4749]: I0219 19:15:51.679300 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:15:51 crc kubenswrapper[4749]: E0219 19:15:51.680063 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:16:04 crc kubenswrapper[4749]: I0219 19:16:04.679187 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:16:04 crc kubenswrapper[4749]: E0219 19:16:04.680047 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:16:19 crc kubenswrapper[4749]: I0219 19:16:19.679383 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:16:19 crc kubenswrapper[4749]: E0219 19:16:19.680172 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:16:31 crc kubenswrapper[4749]: I0219 19:16:31.679787 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:16:31 crc kubenswrapper[4749]: E0219 19:16:31.680590 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:16:43 crc kubenswrapper[4749]: I0219 19:16:43.679658 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:16:43 crc kubenswrapper[4749]: E0219 19:16:43.681838 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:16:55 crc kubenswrapper[4749]: I0219 19:16:55.679382 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:16:55 crc kubenswrapper[4749]: E0219 19:16:55.680355 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:16:56 crc kubenswrapper[4749]: I0219 19:16:56.338858 4749 generic.go:334] "Generic (PLEG): container finished" podID="39f74bf8-e240-408d-a674-c61fcf66fd06" containerID="cb3f2b06f5bb0d06c284ad677b4c16fa69291ee3d1ecb6c64b3981c0992a0015" exitCode=0
Feb 19 19:16:56 crc kubenswrapper[4749]: I0219 19:16:56.338940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr" event={"ID":"39f74bf8-e240-408d-a674-c61fcf66fd06","Type":"ContainerDied","Data":"cb3f2b06f5bb0d06c284ad677b4c16fa69291ee3d1ecb6c64b3981c0992a0015"}
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.745075 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.903187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ssh-key-openstack-edpm-ipam\") pod \"39f74bf8-e240-408d-a674-c61fcf66fd06\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") "
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.903320 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-1\") pod \"39f74bf8-e240-408d-a674-c61fcf66fd06\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") "
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.903375 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-0\") pod \"39f74bf8-e240-408d-a674-c61fcf66fd06\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") "
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.903428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-2\") pod \"39f74bf8-e240-408d-a674-c61fcf66fd06\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") "
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.903601 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-inventory\") pod \"39f74bf8-e240-408d-a674-c61fcf66fd06\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") "
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.903634 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-telemetry-combined-ca-bundle\") pod \"39f74bf8-e240-408d-a674-c61fcf66fd06\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") "
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.903692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv2d2\" (UniqueName: \"kubernetes.io/projected/39f74bf8-e240-408d-a674-c61fcf66fd06-kube-api-access-xv2d2\") pod \"39f74bf8-e240-408d-a674-c61fcf66fd06\" (UID: \"39f74bf8-e240-408d-a674-c61fcf66fd06\") "
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.917402 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "39f74bf8-e240-408d-a674-c61fcf66fd06" (UID: "39f74bf8-e240-408d-a674-c61fcf66fd06"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.917952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f74bf8-e240-408d-a674-c61fcf66fd06-kube-api-access-xv2d2" (OuterVolumeSpecName: "kube-api-access-xv2d2") pod "39f74bf8-e240-408d-a674-c61fcf66fd06" (UID: "39f74bf8-e240-408d-a674-c61fcf66fd06"). InnerVolumeSpecName "kube-api-access-xv2d2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.932162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "39f74bf8-e240-408d-a674-c61fcf66fd06" (UID: "39f74bf8-e240-408d-a674-c61fcf66fd06"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.947952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "39f74bf8-e240-408d-a674-c61fcf66fd06" (UID: "39f74bf8-e240-408d-a674-c61fcf66fd06"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.955478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "39f74bf8-e240-408d-a674-c61fcf66fd06" (UID: "39f74bf8-e240-408d-a674-c61fcf66fd06"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.964562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-inventory" (OuterVolumeSpecName: "inventory") pod "39f74bf8-e240-408d-a674-c61fcf66fd06" (UID: "39f74bf8-e240-408d-a674-c61fcf66fd06"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:16:57 crc kubenswrapper[4749]: I0219 19:16:57.970144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "39f74bf8-e240-408d-a674-c61fcf66fd06" (UID: "39f74bf8-e240-408d-a674-c61fcf66fd06"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.006494 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.006535 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.006552 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv2d2\" (UniqueName: \"kubernetes.io/projected/39f74bf8-e240-408d-a674-c61fcf66fd06-kube-api-access-xv2d2\") on node \"crc\" DevicePath \"\""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.006564 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.006577 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.006592 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.006611 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/39f74bf8-e240-408d-a674-c61fcf66fd06-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.369579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr" event={"ID":"39f74bf8-e240-408d-a674-c61fcf66fd06","Type":"ContainerDied","Data":"35958c58c5cdaeb0184aed6e1274503167a45e0451d9ceed4f26d492f4c3c7d4"}
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.369626 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35958c58c5cdaeb0184aed6e1274503167a45e0451d9ceed4f26d492f4c3c7d4"
Feb 19 19:16:58 crc kubenswrapper[4749]: I0219 19:16:58.369716 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tsflr"
Feb 19 19:17:01 crc kubenswrapper[4749]: E0219 19:17:01.354558 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.128:56462->38.102.83.128:36573: write tcp 38.102.83.128:56462->38.102.83.128:36573: write: broken pipe
Feb 19 19:17:06 crc kubenswrapper[4749]: I0219 19:17:06.686925 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:17:06 crc kubenswrapper[4749]: E0219 19:17:06.688871 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:17:19 crc kubenswrapper[4749]: I0219 19:17:19.678805 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:17:19 crc kubenswrapper[4749]: E0219 19:17:19.679601 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.385513 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Feb 19 19:17:29 crc kubenswrapper[4749]: E0219 19:17:29.386556 4749 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="9a7ee6e5-4a4d-49d3-83f1-59192e92daba" containerName="collect-profiles" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.386574 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7ee6e5-4a4d-49d3-83f1-59192e92daba" containerName="collect-profiles" Feb 19 19:17:29 crc kubenswrapper[4749]: E0219 19:17:29.386590 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f74bf8-e240-408d-a674-c61fcf66fd06" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.386600 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f74bf8-e240-408d-a674-c61fcf66fd06" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.386836 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7ee6e5-4a4d-49d3-83f1-59192e92daba" containerName="collect-profiles" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.386873 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f74bf8-e240-408d-a674-c61fcf66fd06" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.387979 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.389892 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.405640 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.505148 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.507524 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.511786 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.523061 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.541147 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.542851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.543910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-nvme\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544182 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-lib-modules\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-config-data\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-sys\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-config-data-custom\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc64w\" (UniqueName: \"kubernetes.io/projected/94d055c5-b069-494b-a250-27a5c39826c2-kube-api-access-pc64w\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-scripts\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-dev\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.544745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-run\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.545751 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.599774 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-scripts\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647830 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-dev\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.647990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648222 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-run\") pod \"cinder-backup-0\" (UID: 
\"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhjz\" (UniqueName: \"kubernetes.io/projected/afe70cf9-b491-4c93-8c94-ae052eb02db4-kube-api-access-ckhjz\") pod \"cinder-volume-nfs-0\" (UID: 
\"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc 
kubenswrapper[4749]: I0219 19:17:29.648501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-nvme\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-lib-modules\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.648989 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-dev\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-config-data\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" 
Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-dev\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649396 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-run\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-lib-modules\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-sys\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649479 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-sys\") pod \"cinder-backup-0\" 
(UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmf2\" (UniqueName: \"kubernetes.io/projected/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-kube-api-access-fdmf2\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-config-data-custom\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-etc-nvme\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649808 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-run\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-sys\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" 
(UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.649991 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc64w\" (UniqueName: \"kubernetes.io/projected/94d055c5-b069-494b-a250-27a5c39826c2-kube-api-access-pc64w\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.650953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.651008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/94d055c5-b069-494b-a250-27a5c39826c2-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.654362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-config-data-custom\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.654488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " 
pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.655085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-scripts\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.656099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d055c5-b069-494b-a250-27a5c39826c2-config-data\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.668770 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc64w\" (UniqueName: \"kubernetes.io/projected/94d055c5-b069-494b-a250-27a5c39826c2-kube-api-access-pc64w\") pod \"cinder-backup-0\" (UID: \"94d055c5-b069-494b-a250-27a5c39826c2\") " pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.750050 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhjz\" (UniqueName: \"kubernetes.io/projected/afe70cf9-b491-4c93-8c94-ae052eb02db4-kube-api-access-ckhjz\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751759 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751812 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751892 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-run\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.751995 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-dev\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752138 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fdmf2\" (UniqueName: \"kubernetes.io/projected/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-kube-api-access-fdmf2\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-run\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-sys\") pod \"cinder-volume-nfs-0\" (UID: 
\"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752485 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-dev\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: 
\"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752630 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752793 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752820 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.752855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.755379 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.755463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.755487 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-run\") pod \"cinder-volume-nfs-0\" (UID: 
\"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.755519 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.755545 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.756591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.756790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.756834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.756889 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-sys\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.756937 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.756974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.757006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/afe70cf9-b491-4c93-8c94-ae052eb02db4-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.759222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.761056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") 
" pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.761110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.770235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.774502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe70cf9-b491-4c93-8c94-ae052eb02db4-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.774613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhjz\" (UniqueName: \"kubernetes.io/projected/afe70cf9-b491-4c93-8c94-ae052eb02db4-kube-api-access-ckhjz\") pod \"cinder-volume-nfs-0\" (UID: \"afe70cf9-b491-4c93-8c94-ae052eb02db4\") " pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.782105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmf2\" (UniqueName: \"kubernetes.io/projected/f2a5ecf9-3280-4da3-9ea6-6491401e4daa-kube-api-access-fdmf2\") pod \"cinder-volume-nfs-2-0\" (UID: \"f2a5ecf9-3280-4da3-9ea6-6491401e4daa\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.833197 4749 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:29 crc kubenswrapper[4749]: I0219 19:17:29.873588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:30 crc kubenswrapper[4749]: I0219 19:17:30.402136 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 19:17:30 crc kubenswrapper[4749]: W0219 19:17:30.404612 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94d055c5_b069_494b_a250_27a5c39826c2.slice/crio-83aed2c1baa40f4785efc03017dfea2a89f116e47fd46ea7dead03018b1520c1 WatchSource:0}: Error finding container 83aed2c1baa40f4785efc03017dfea2a89f116e47fd46ea7dead03018b1520c1: Status 404 returned error can't find the container with id 83aed2c1baa40f4785efc03017dfea2a89f116e47fd46ea7dead03018b1520c1 Feb 19 19:17:30 crc kubenswrapper[4749]: I0219 19:17:30.506535 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 19:17:30 crc kubenswrapper[4749]: W0219 19:17:30.614684 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe70cf9_b491_4c93_8c94_ae052eb02db4.slice/crio-e79bfa6adb2571376cc435c77cf9a5440b77ea41fcfa8c884f0e8708391acebb WatchSource:0}: Error finding container e79bfa6adb2571376cc435c77cf9a5440b77ea41fcfa8c884f0e8708391acebb: Status 404 returned error can't find the container with id e79bfa6adb2571376cc435c77cf9a5440b77ea41fcfa8c884f0e8708391acebb Feb 19 19:17:30 crc kubenswrapper[4749]: I0219 19:17:30.619755 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 19:17:30 crc kubenswrapper[4749]: I0219 19:17:30.629319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" 
event={"ID":"f2a5ecf9-3280-4da3-9ea6-6491401e4daa","Type":"ContainerStarted","Data":"4da36b32c4c33ad72b5b04c25903111bdc1df7d89656d5182214e99c906130c5"} Feb 19 19:17:30 crc kubenswrapper[4749]: I0219 19:17:30.630987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94d055c5-b069-494b-a250-27a5c39826c2","Type":"ContainerStarted","Data":"83aed2c1baa40f4785efc03017dfea2a89f116e47fd46ea7dead03018b1520c1"} Feb 19 19:17:30 crc kubenswrapper[4749]: I0219 19:17:30.636113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"afe70cf9-b491-4c93-8c94-ae052eb02db4","Type":"ContainerStarted","Data":"e79bfa6adb2571376cc435c77cf9a5440b77ea41fcfa8c884f0e8708391acebb"} Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.647055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"f2a5ecf9-3280-4da3-9ea6-6491401e4daa","Type":"ContainerStarted","Data":"1e20633af80b871017a4be57181f71e52bcdd5ecd995efe0593e2f86841b5ced"} Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.647635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"f2a5ecf9-3280-4da3-9ea6-6491401e4daa","Type":"ContainerStarted","Data":"56934273b27cf3743c0b62350ce8a083b257cded5c6d2e7af87257fe122c9204"} Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.648629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94d055c5-b069-494b-a250-27a5c39826c2","Type":"ContainerStarted","Data":"16fb3bcd4e273ca8b87ac776dc994d6c9d5ef6583866cb509ebb6a8c268741d6"} Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.648665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94d055c5-b069-494b-a250-27a5c39826c2","Type":"ContainerStarted","Data":"5a651a7bcd452765be118bdd90d260a2db731d2393fb4b6c8e9fab9e5916e24a"} Feb 19 19:17:31 crc 
kubenswrapper[4749]: I0219 19:17:31.650641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"afe70cf9-b491-4c93-8c94-ae052eb02db4","Type":"ContainerStarted","Data":"ead87d21f3e874925254b1040cdc2c0c362133d6eff383b43cf8b297539da73f"} Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.650675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"afe70cf9-b491-4c93-8c94-ae052eb02db4","Type":"ContainerStarted","Data":"fbf457d72ef0eeaa42afc7ec01ede52c6ec7ad5c40c1bb725a09ada6c3cf9034"} Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.670413 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.407841955 podStartE2EDuration="2.670396407s" podCreationTimestamp="2026-02-19 19:17:29 +0000 UTC" firstStartedPulling="2026-02-19 19:17:30.552959155 +0000 UTC m=+2624.514179129" lastFinishedPulling="2026-02-19 19:17:30.815513627 +0000 UTC m=+2624.776733581" observedRunningTime="2026-02-19 19:17:31.66885946 +0000 UTC m=+2625.630079404" watchObservedRunningTime="2026-02-19 19:17:31.670396407 +0000 UTC m=+2625.631616361" Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.679185 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:17:31 crc kubenswrapper[4749]: E0219 19:17:31.679474 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.703427 4749 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=2.502909791 podStartE2EDuration="2.703409311s" podCreationTimestamp="2026-02-19 19:17:29 +0000 UTC" firstStartedPulling="2026-02-19 19:17:30.617738542 +0000 UTC m=+2624.578958496" lastFinishedPulling="2026-02-19 19:17:30.818238062 +0000 UTC m=+2624.779458016" observedRunningTime="2026-02-19 19:17:31.693755699 +0000 UTC m=+2625.654975673" watchObservedRunningTime="2026-02-19 19:17:31.703409311 +0000 UTC m=+2625.664629265" Feb 19 19:17:31 crc kubenswrapper[4749]: I0219 19:17:31.721465 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.383783308 podStartE2EDuration="2.721447015s" podCreationTimestamp="2026-02-19 19:17:29 +0000 UTC" firstStartedPulling="2026-02-19 19:17:30.408459482 +0000 UTC m=+2624.369679436" lastFinishedPulling="2026-02-19 19:17:30.746123189 +0000 UTC m=+2624.707343143" observedRunningTime="2026-02-19 19:17:31.720484131 +0000 UTC m=+2625.681704085" watchObservedRunningTime="2026-02-19 19:17:31.721447015 +0000 UTC m=+2625.682666959" Feb 19 19:17:34 crc kubenswrapper[4749]: I0219 19:17:34.751052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 19 19:17:34 crc kubenswrapper[4749]: I0219 19:17:34.834278 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:34 crc kubenswrapper[4749]: I0219 19:17:34.874874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:40 crc kubenswrapper[4749]: I0219 19:17:40.005789 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Feb 19 19:17:40 crc kubenswrapper[4749]: I0219 19:17:40.009779 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 19 19:17:40 crc 
kubenswrapper[4749]: I0219 19:17:40.070871 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.698213 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:17:46 crc kubenswrapper[4749]: E0219 19:17:46.699028 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.715788 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fht8c"] Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.717960 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.728169 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fht8c"] Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.754221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-catalog-content\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.754424 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rwd\" (UniqueName: \"kubernetes.io/projected/e5e73456-dba9-4a6d-a39f-510b438a8a04-kube-api-access-s9rwd\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.754459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-utilities\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.856307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-catalog-content\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.856445 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s9rwd\" (UniqueName: \"kubernetes.io/projected/e5e73456-dba9-4a6d-a39f-510b438a8a04-kube-api-access-s9rwd\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.856473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-utilities\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.856903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-catalog-content\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.856942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-utilities\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:46 crc kubenswrapper[4749]: I0219 19:17:46.881324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rwd\" (UniqueName: \"kubernetes.io/projected/e5e73456-dba9-4a6d-a39f-510b438a8a04-kube-api-access-s9rwd\") pod \"redhat-marketplace-fht8c\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:47 crc kubenswrapper[4749]: I0219 19:17:47.050355 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:47 crc kubenswrapper[4749]: I0219 19:17:47.572774 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fht8c"] Feb 19 19:17:47 crc kubenswrapper[4749]: I0219 19:17:47.836154 4749 generic.go:334] "Generic (PLEG): container finished" podID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerID="3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4" exitCode=0 Feb 19 19:17:47 crc kubenswrapper[4749]: I0219 19:17:47.836199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fht8c" event={"ID":"e5e73456-dba9-4a6d-a39f-510b438a8a04","Type":"ContainerDied","Data":"3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4"} Feb 19 19:17:47 crc kubenswrapper[4749]: I0219 19:17:47.836240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fht8c" event={"ID":"e5e73456-dba9-4a6d-a39f-510b438a8a04","Type":"ContainerStarted","Data":"29ac2b1ec6647907cf66bc2dcb2dacd302efaea29f4d09a9f68727c139d9b161"} Feb 19 19:17:47 crc kubenswrapper[4749]: I0219 19:17:47.837816 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:17:49 crc kubenswrapper[4749]: I0219 19:17:49.860869 4749 generic.go:334] "Generic (PLEG): container finished" podID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerID="f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074" exitCode=0 Feb 19 19:17:49 crc kubenswrapper[4749]: I0219 19:17:49.860929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fht8c" event={"ID":"e5e73456-dba9-4a6d-a39f-510b438a8a04","Type":"ContainerDied","Data":"f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074"} Feb 19 19:17:50 crc kubenswrapper[4749]: I0219 19:17:50.871398 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-fht8c" event={"ID":"e5e73456-dba9-4a6d-a39f-510b438a8a04","Type":"ContainerStarted","Data":"d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704"} Feb 19 19:17:50 crc kubenswrapper[4749]: I0219 19:17:50.895509 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fht8c" podStartSLOduration=2.4400483570000002 podStartE2EDuration="4.895489722s" podCreationTimestamp="2026-02-19 19:17:46 +0000 UTC" firstStartedPulling="2026-02-19 19:17:47.837524553 +0000 UTC m=+2641.798744507" lastFinishedPulling="2026-02-19 19:17:50.292965918 +0000 UTC m=+2644.254185872" observedRunningTime="2026-02-19 19:17:50.887491789 +0000 UTC m=+2644.848711763" watchObservedRunningTime="2026-02-19 19:17:50.895489722 +0000 UTC m=+2644.856709676" Feb 19 19:17:57 crc kubenswrapper[4749]: I0219 19:17:57.050996 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:57 crc kubenswrapper[4749]: I0219 19:17:57.051544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:57 crc kubenswrapper[4749]: I0219 19:17:57.103580 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:57 crc kubenswrapper[4749]: I0219 19:17:57.990821 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:17:58 crc kubenswrapper[4749]: I0219 19:17:58.038191 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fht8c"] Feb 19 19:17:58 crc kubenswrapper[4749]: I0219 19:17:58.679879 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:17:58 crc 
kubenswrapper[4749]: E0219 19:17:58.680441 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:17:59 crc kubenswrapper[4749]: I0219 19:17:59.955449 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fht8c" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="registry-server" containerID="cri-o://d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704" gracePeriod=2 Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.448019 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.561075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9rwd\" (UniqueName: \"kubernetes.io/projected/e5e73456-dba9-4a6d-a39f-510b438a8a04-kube-api-access-s9rwd\") pod \"e5e73456-dba9-4a6d-a39f-510b438a8a04\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.561975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-catalog-content\") pod \"e5e73456-dba9-4a6d-a39f-510b438a8a04\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.562069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-utilities\") pod \"e5e73456-dba9-4a6d-a39f-510b438a8a04\" (UID: \"e5e73456-dba9-4a6d-a39f-510b438a8a04\") " Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.563176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-utilities" (OuterVolumeSpecName: "utilities") pod "e5e73456-dba9-4a6d-a39f-510b438a8a04" (UID: "e5e73456-dba9-4a6d-a39f-510b438a8a04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.566153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e73456-dba9-4a6d-a39f-510b438a8a04-kube-api-access-s9rwd" (OuterVolumeSpecName: "kube-api-access-s9rwd") pod "e5e73456-dba9-4a6d-a39f-510b438a8a04" (UID: "e5e73456-dba9-4a6d-a39f-510b438a8a04"). InnerVolumeSpecName "kube-api-access-s9rwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.593111 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5e73456-dba9-4a6d-a39f-510b438a8a04" (UID: "e5e73456-dba9-4a6d-a39f-510b438a8a04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.664955 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9rwd\" (UniqueName: \"kubernetes.io/projected/e5e73456-dba9-4a6d-a39f-510b438a8a04-kube-api-access-s9rwd\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.665005 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.665017 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e73456-dba9-4a6d-a39f-510b438a8a04-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.966567 4749 generic.go:334] "Generic (PLEG): container finished" podID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerID="d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704" exitCode=0 Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.966635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fht8c" event={"ID":"e5e73456-dba9-4a6d-a39f-510b438a8a04","Type":"ContainerDied","Data":"d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704"} Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.966878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fht8c" event={"ID":"e5e73456-dba9-4a6d-a39f-510b438a8a04","Type":"ContainerDied","Data":"29ac2b1ec6647907cf66bc2dcb2dacd302efaea29f4d09a9f68727c139d9b161"} Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.966901 4749 scope.go:117] "RemoveContainer" containerID="d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 
19:18:00.966769 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fht8c" Feb 19 19:18:00 crc kubenswrapper[4749]: I0219 19:18:00.989411 4749 scope.go:117] "RemoveContainer" containerID="f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074" Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.001170 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fht8c"] Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.011714 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fht8c"] Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.023614 4749 scope.go:117] "RemoveContainer" containerID="3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4" Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.057422 4749 scope.go:117] "RemoveContainer" containerID="d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704" Feb 19 19:18:01 crc kubenswrapper[4749]: E0219 19:18:01.057745 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704\": container with ID starting with d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704 not found: ID does not exist" containerID="d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704" Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.057799 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704"} err="failed to get container status \"d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704\": rpc error: code = NotFound desc = could not find container \"d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704\": container with ID starting with 
d9047d4fe869707d76d6fad315f08e8086633856b8f53acfaf8b7227aeac8704 not found: ID does not exist" Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.057826 4749 scope.go:117] "RemoveContainer" containerID="f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074" Feb 19 19:18:01 crc kubenswrapper[4749]: E0219 19:18:01.058086 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074\": container with ID starting with f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074 not found: ID does not exist" containerID="f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074" Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.058133 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074"} err="failed to get container status \"f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074\": rpc error: code = NotFound desc = could not find container \"f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074\": container with ID starting with f7515800503d7ab54d2fd2a8c0f0cc8d59b4e4f9d64eb94dad2012cd131da074 not found: ID does not exist" Feb 19 19:18:01 crc kubenswrapper[4749]: I0219 19:18:01.058147 4749 scope.go:117] "RemoveContainer" containerID="3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4" Feb 19 19:18:01 crc kubenswrapper[4749]: E0219 19:18:01.058549 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4\": container with ID starting with 3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4 not found: ID does not exist" containerID="3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4" Feb 19 19:18:01 crc 
kubenswrapper[4749]: I0219 19:18:01.058585 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4"} err="failed to get container status \"3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4\": rpc error: code = NotFound desc = could not find container \"3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4\": container with ID starting with 3a7e8186dc02efb70bc0e28f7821c9721eb8ba703e893c8f07e0f4c4956585f4 not found: ID does not exist" Feb 19 19:18:02 crc kubenswrapper[4749]: I0219 19:18:02.691363 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" path="/var/lib/kubelet/pods/e5e73456-dba9-4a6d-a39f-510b438a8a04/volumes" Feb 19 19:18:13 crc kubenswrapper[4749]: I0219 19:18:13.679871 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:18:13 crc kubenswrapper[4749]: E0219 19:18:13.680719 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:18:24 crc kubenswrapper[4749]: I0219 19:18:24.679262 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:18:24 crc kubenswrapper[4749]: E0219 19:18:24.680010 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:18:36 crc kubenswrapper[4749]: I0219 19:18:36.685014 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55" Feb 19 19:18:37 crc kubenswrapper[4749]: I0219 19:18:37.189595 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:18:37 crc kubenswrapper[4749]: I0219 19:18:37.190282 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="prometheus" containerID="cri-o://ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a" gracePeriod=600 Feb 19 19:18:37 crc kubenswrapper[4749]: I0219 19:18:37.190525 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="thanos-sidecar" containerID="cri-o://bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665" gracePeriod=600 Feb 19 19:18:37 crc kubenswrapper[4749]: I0219 19:18:37.190593 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="config-reloader" containerID="cri-o://eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2" gracePeriod=600 Feb 19 19:18:37 crc kubenswrapper[4749]: I0219 19:18:37.307126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"8093d672b657eb163bf89fa6162798b2d16c7bce8e69d30b979dad8d5d696438"} Feb 19 19:18:38 
crc kubenswrapper[4749]: I0219 19:18:38.200009 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.230713 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-config\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.230780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-1\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.230827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-tls-assets\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.230877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.230953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15a688ac-ce3d-40e9-90d0-b013569164e3-config-out\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 
19:18:38.231017 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231063 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-thanos-prometheus-http-client-file\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231120 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-0\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwhc\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-kube-api-access-zkwhc\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231622 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-secret-combined-ca-bundle\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.231951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-2\") pod \"15a688ac-ce3d-40e9-90d0-b013569164e3\" (UID: \"15a688ac-ce3d-40e9-90d0-b013569164e3\") " Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.232623 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.233103 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.233093 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.244826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.244953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.255414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-kube-api-access-zkwhc" (OuterVolumeSpecName: "kube-api-access-zkwhc") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "kube-api-access-zkwhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.255879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-config" (OuterVolumeSpecName: "config") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.265930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.275184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a688ac-ce3d-40e9-90d0-b013569164e3-config-out" (OuterVolumeSpecName: "config-out") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.275772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.275978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.294657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "pvc-82420923-4549-44bf-81a2-5cca6d09b55a". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326567 4749 generic.go:334] "Generic (PLEG): container finished" podID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerID="bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665" exitCode=0
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326596 4749 generic.go:334] "Generic (PLEG): container finished" podID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerID="eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2" exitCode=0
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326603 4749 generic.go:334] "Generic (PLEG): container finished" podID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerID="ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a" exitCode=0
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerDied","Data":"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"}
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerDied","Data":"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"}
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerDied","Data":"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"}
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"15a688ac-ce3d-40e9-90d0-b013569164e3","Type":"ContainerDied","Data":"6232beab92b2ee5f38edfd6b54766ce3eefef5f79d59f38952df942c0abd0b94"}
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.326815 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.327303 4749 scope.go:117] "RemoveContainer" containerID="bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334451 4749 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15a688ac-ce3d-40e9-90d0-b013569164e3-config-out\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334480 4749 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334491 4749 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334504 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334517 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwhc\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-kube-api-access-zkwhc\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334553 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") on node \"crc\" "
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334565 4749 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334575 4749 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.334586 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/15a688ac-ce3d-40e9-90d0-b013569164e3-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.339310 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.339351 4749 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15a688ac-ce3d-40e9-90d0-b013569164e3-tls-assets\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.369676 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.369817 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-82420923-4549-44bf-81a2-5cca6d09b55a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a") on node "crc"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.385200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config" (OuterVolumeSpecName: "web-config") pod "15a688ac-ce3d-40e9-90d0-b013569164e3" (UID: "15a688ac-ce3d-40e9-90d0-b013569164e3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.425141 4749 scope.go:117] "RemoveContainer" containerID="eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.441545 4749 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15a688ac-ce3d-40e9-90d0-b013569164e3-web-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.441738 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.446223 4749 scope.go:117] "RemoveContainer" containerID="ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.464731 4749 scope.go:117] "RemoveContainer" containerID="932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.483859 4749 scope.go:117] "RemoveContainer" containerID="bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.484326 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665\": container with ID starting with bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665 not found: ID does not exist" containerID="bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.484460 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"} err="failed to get container status \"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665\": rpc error: code = NotFound desc = could not find container \"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665\": container with ID starting with bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.484551 4749 scope.go:117] "RemoveContainer" containerID="eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.484971 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2\": container with ID starting with eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2 not found: ID does not exist" containerID="eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.485078 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"} err="failed to get container status \"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2\": rpc error: code = NotFound desc = could not find container \"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2\": container with ID starting with eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.485111 4749 scope.go:117] "RemoveContainer" containerID="ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.485517 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a\": container with ID starting with ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a not found: ID does not exist" containerID="ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.485667 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"} err="failed to get container status \"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a\": rpc error: code = NotFound desc = could not find container \"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a\": container with ID starting with ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.485772 4749 scope.go:117] "RemoveContainer" containerID="932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.486126 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8\": container with ID starting with 932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8 not found: ID does not exist" containerID="932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.486213 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"} err="failed to get container status \"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8\": rpc error: code = NotFound desc = could not find container \"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8\": container with ID starting with 932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.486291 4749 scope.go:117] "RemoveContainer" containerID="bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.486565 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"} err="failed to get container status \"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665\": rpc error: code = NotFound desc = could not find container \"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665\": container with ID starting with bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.486596 4749 scope.go:117] "RemoveContainer" containerID="eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.486818 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"} err="failed to get container status \"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2\": rpc error: code = NotFound desc = could not find container \"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2\": container with ID starting with eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.486836 4749 scope.go:117] "RemoveContainer" containerID="ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487014 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"} err="failed to get container status \"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a\": rpc error: code = NotFound desc = could not find container \"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a\": container with ID starting with ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487050 4749 scope.go:117] "RemoveContainer" containerID="932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487227 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"} err="failed to get container status \"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8\": rpc error: code = NotFound desc = could not find container \"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8\": container with ID starting with 932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487246 4749 scope.go:117] "RemoveContainer" containerID="bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487431 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665"} err="failed to get container status \"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665\": rpc error: code = NotFound desc = could not find container \"bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665\": container with ID starting with bda247fdba956df0533e901b8f8a7301db7a1cdc1771cafeea1206f9f3adb665 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487450 4749 scope.go:117] "RemoveContainer" containerID="eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487658 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2"} err="failed to get container status \"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2\": rpc error: code = NotFound desc = could not find container \"eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2\": container with ID starting with eeb5965773fef4719ed9de17a0da7de9483b329b067732af29e29198c020c7f2 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487672 4749 scope.go:117] "RemoveContainer" containerID="ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487853 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a"} err="failed to get container status \"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a\": rpc error: code = NotFound desc = could not find container \"ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a\": container with ID starting with ac50eb7c57f2273200cf4650239284e60b3fcea351fbcab08791bcf1592a0f5a not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.487867 4749 scope.go:117] "RemoveContainer" containerID="932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.488154 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8"} err="failed to get container status \"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8\": rpc error: code = NotFound desc = could not find container \"932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8\": container with ID starting with 932f0f8fbcf7192d70ec8c5b1ecfa8ee758f7808e4bb9fc686ea1d9e697ceeb8 not found: ID does not exist"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.667380 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.678493 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.691772 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" path="/var/lib/kubelet/pods/15a688ac-ce3d-40e9-90d0-b013569164e3/volumes"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.714470 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.714987 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="init-config-reloader"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715006 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="init-config-reloader"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.715018 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="registry-server"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715040 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="registry-server"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.715083 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="extract-content"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715093 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="extract-content"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.715129 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="extract-utilities"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715139 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="extract-utilities"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.715150 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="config-reloader"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715158 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="config-reloader"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.715171 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="prometheus"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715178 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="prometheus"
Feb 19 19:18:38 crc kubenswrapper[4749]: E0219 19:18:38.715209 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="thanos-sidecar"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715216 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="thanos-sidecar"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715449 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e73456-dba9-4a6d-a39f-510b438a8a04" containerName="registry-server"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715479 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="prometheus"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715490 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="config-reloader"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.715501 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a688ac-ce3d-40e9-90d0-b013569164e3" containerName="thanos-sidecar"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.717726 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.724803 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-h9bs6"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.726354 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.726596 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.726622 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.726650 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.726734 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.728044 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.732847 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.734924 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.864944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.864999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-config\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865235 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gplrb\" (UniqueName: \"kubernetes.io/projected/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-kube-api-access-gplrb\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865787 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.865939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-config\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967398 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967449 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gplrb\" (UniqueName: \"kubernetes.io/projected/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-kube-api-access-gplrb\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967546 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.967624 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.968294 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.968368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.968492 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.968952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.969334 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability
not set. Skipping MountDevice... Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.969369 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7aef4d84e7f064b8dddb5f07903a3617545888f3f79f605754eebcaaed810a22/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.969434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.972616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.972754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.974341 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.975241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.975792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.982105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.982591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.983278 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-config\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:38 crc kubenswrapper[4749]: I0219 19:18:38.987434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gplrb\" (UniqueName: \"kubernetes.io/projected/f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f-kube-api-access-gplrb\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:39 crc kubenswrapper[4749]: I0219 19:18:39.022716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-82420923-4549-44bf-81a2-5cca6d09b55a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82420923-4549-44bf-81a2-5cca6d09b55a\") pod \"prometheus-metric-storage-0\" (UID: \"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:39 crc kubenswrapper[4749]: I0219 19:18:39.076945 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:18:39 crc kubenswrapper[4749]: I0219 19:18:39.544116 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:18:39 crc kubenswrapper[4749]: W0219 19:18:39.557652 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf79a6e1e_9ad3_43c8_a02e_e0a9ecb59d6f.slice/crio-edeb6e6f69db21b8e8d905767fb34272c809966737d538568f95d7e2317f6228 WatchSource:0}: Error finding container edeb6e6f69db21b8e8d905767fb34272c809966737d538568f95d7e2317f6228: Status 404 returned error can't find the container with id edeb6e6f69db21b8e8d905767fb34272c809966737d538568f95d7e2317f6228 Feb 19 19:18:40 crc kubenswrapper[4749]: I0219 19:18:40.348272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f","Type":"ContainerStarted","Data":"edeb6e6f69db21b8e8d905767fb34272c809966737d538568f95d7e2317f6228"} Feb 19 19:18:43 crc kubenswrapper[4749]: I0219 19:18:43.380440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f","Type":"ContainerStarted","Data":"c9a876f748839e8201e3db01d9bc713d7d77a5e6357d5fdc53fbb5e3be369806"} Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.399188 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qll7d"] Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.402287 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.407580 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qll7d"] Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.513806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smv7h\" (UniqueName: \"kubernetes.io/projected/dee5d824-f47d-4120-9587-d9c78449c492-kube-api-access-smv7h\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.513871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-catalog-content\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.514337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-utilities\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.616783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smv7h\" (UniqueName: \"kubernetes.io/projected/dee5d824-f47d-4120-9587-d9c78449c492-kube-api-access-smv7h\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.616837 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-catalog-content\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.616973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-utilities\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.617490 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-catalog-content\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.617535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-utilities\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.640810 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smv7h\" (UniqueName: \"kubernetes.io/projected/dee5d824-f47d-4120-9587-d9c78449c492-kube-api-access-smv7h\") pod \"community-operators-qll7d\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:45 crc kubenswrapper[4749]: I0219 19:18:45.724012 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:46 crc kubenswrapper[4749]: I0219 19:18:46.297872 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qll7d"] Feb 19 19:18:46 crc kubenswrapper[4749]: W0219 19:18:46.309565 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddee5d824_f47d_4120_9587_d9c78449c492.slice/crio-1ceb2dfac1477c7c6a160fdd7942aee8afbf52be77d9a73ea84c01307992b720 WatchSource:0}: Error finding container 1ceb2dfac1477c7c6a160fdd7942aee8afbf52be77d9a73ea84c01307992b720: Status 404 returned error can't find the container with id 1ceb2dfac1477c7c6a160fdd7942aee8afbf52be77d9a73ea84c01307992b720 Feb 19 19:18:46 crc kubenswrapper[4749]: I0219 19:18:46.418241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qll7d" event={"ID":"dee5d824-f47d-4120-9587-d9c78449c492","Type":"ContainerStarted","Data":"1ceb2dfac1477c7c6a160fdd7942aee8afbf52be77d9a73ea84c01307992b720"} Feb 19 19:18:47 crc kubenswrapper[4749]: I0219 19:18:47.430265 4749 generic.go:334] "Generic (PLEG): container finished" podID="dee5d824-f47d-4120-9587-d9c78449c492" containerID="05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202" exitCode=0 Feb 19 19:18:47 crc kubenswrapper[4749]: I0219 19:18:47.430422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qll7d" event={"ID":"dee5d824-f47d-4120-9587-d9c78449c492","Type":"ContainerDied","Data":"05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202"} Feb 19 19:18:49 crc kubenswrapper[4749]: I0219 19:18:49.457520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qll7d" 
event={"ID":"dee5d824-f47d-4120-9587-d9c78449c492","Type":"ContainerStarted","Data":"7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759"} Feb 19 19:18:50 crc kubenswrapper[4749]: I0219 19:18:50.469190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qll7d" event={"ID":"dee5d824-f47d-4120-9587-d9c78449c492","Type":"ContainerDied","Data":"7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759"} Feb 19 19:18:50 crc kubenswrapper[4749]: I0219 19:18:50.469007 4749 generic.go:334] "Generic (PLEG): container finished" podID="dee5d824-f47d-4120-9587-d9c78449c492" containerID="7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759" exitCode=0 Feb 19 19:18:51 crc kubenswrapper[4749]: I0219 19:18:51.481601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qll7d" event={"ID":"dee5d824-f47d-4120-9587-d9c78449c492","Type":"ContainerStarted","Data":"479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74"} Feb 19 19:18:51 crc kubenswrapper[4749]: I0219 19:18:51.484180 4749 generic.go:334] "Generic (PLEG): container finished" podID="f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f" containerID="c9a876f748839e8201e3db01d9bc713d7d77a5e6357d5fdc53fbb5e3be369806" exitCode=0 Feb 19 19:18:51 crc kubenswrapper[4749]: I0219 19:18:51.484220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f","Type":"ContainerDied","Data":"c9a876f748839e8201e3db01d9bc713d7d77a5e6357d5fdc53fbb5e3be369806"} Feb 19 19:18:51 crc kubenswrapper[4749]: I0219 19:18:51.507910 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qll7d" podStartSLOduration=2.961220468 podStartE2EDuration="6.507885085s" podCreationTimestamp="2026-02-19 19:18:45 +0000 UTC" firstStartedPulling="2026-02-19 19:18:47.432631072 +0000 UTC 
m=+2701.393851026" lastFinishedPulling="2026-02-19 19:18:50.979295679 +0000 UTC m=+2704.940515643" observedRunningTime="2026-02-19 19:18:51.503166012 +0000 UTC m=+2705.464385986" watchObservedRunningTime="2026-02-19 19:18:51.507885085 +0000 UTC m=+2705.469105059" Feb 19 19:18:52 crc kubenswrapper[4749]: I0219 19:18:52.495327 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f","Type":"ContainerStarted","Data":"fad6fafa63e1704b8160ea0095001a510ea3cd70aec42249b94de2227be9a59e"} Feb 19 19:18:55 crc kubenswrapper[4749]: I0219 19:18:55.523871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f","Type":"ContainerStarted","Data":"ea81cd49bae115f3624c6533d12a908a73d9e77682093f83d8abb471c7c58d5f"} Feb 19 19:18:55 crc kubenswrapper[4749]: I0219 19:18:55.524506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f","Type":"ContainerStarted","Data":"ff30e44c9b8d3d603b7da302bacc1fa44478808fe1be9897db6261f4a680f982"} Feb 19 19:18:55 crc kubenswrapper[4749]: I0219 19:18:55.554384 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.554367137 podStartE2EDuration="17.554367137s" podCreationTimestamp="2026-02-19 19:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:18:55.547356308 +0000 UTC m=+2709.508576262" watchObservedRunningTime="2026-02-19 19:18:55.554367137 +0000 UTC m=+2709.515587091" Feb 19 19:18:55 crc kubenswrapper[4749]: I0219 19:18:55.724413 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:55 crc 
kubenswrapper[4749]: I0219 19:18:55.724466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:18:56 crc kubenswrapper[4749]: I0219 19:18:56.786209 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qll7d" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="registry-server" probeResult="failure" output=< Feb 19 19:18:56 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 19:18:56 crc kubenswrapper[4749]: > Feb 19 19:18:59 crc kubenswrapper[4749]: I0219 19:18:59.077430 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 19:19:00 crc kubenswrapper[4749]: E0219 19:19:00.544651 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.128:50448->38.102.83.128:36573: write tcp 38.102.83.128:50448->38.102.83.128:36573: write: broken pipe Feb 19 19:19:05 crc kubenswrapper[4749]: I0219 19:19:05.785290 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:19:05 crc kubenswrapper[4749]: I0219 19:19:05.837181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:19:06 crc kubenswrapper[4749]: I0219 19:19:06.026413 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qll7d"] Feb 19 19:19:07 crc kubenswrapper[4749]: I0219 19:19:07.636471 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qll7d" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="registry-server" containerID="cri-o://479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74" gracePeriod=2 Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 
19:19:08.092110 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.197496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-utilities\") pod \"dee5d824-f47d-4120-9587-d9c78449c492\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.197734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smv7h\" (UniqueName: \"kubernetes.io/projected/dee5d824-f47d-4120-9587-d9c78449c492-kube-api-access-smv7h\") pod \"dee5d824-f47d-4120-9587-d9c78449c492\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.197891 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-catalog-content\") pod \"dee5d824-f47d-4120-9587-d9c78449c492\" (UID: \"dee5d824-f47d-4120-9587-d9c78449c492\") " Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.198874 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-utilities" (OuterVolumeSpecName: "utilities") pod "dee5d824-f47d-4120-9587-d9c78449c492" (UID: "dee5d824-f47d-4120-9587-d9c78449c492"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.204139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee5d824-f47d-4120-9587-d9c78449c492-kube-api-access-smv7h" (OuterVolumeSpecName: "kube-api-access-smv7h") pod "dee5d824-f47d-4120-9587-d9c78449c492" (UID: "dee5d824-f47d-4120-9587-d9c78449c492"). InnerVolumeSpecName "kube-api-access-smv7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.249272 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dee5d824-f47d-4120-9587-d9c78449c492" (UID: "dee5d824-f47d-4120-9587-d9c78449c492"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.300258 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.300296 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smv7h\" (UniqueName: \"kubernetes.io/projected/dee5d824-f47d-4120-9587-d9c78449c492-kube-api-access-smv7h\") on node \"crc\" DevicePath \"\"" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.300306 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dee5d824-f47d-4120-9587-d9c78449c492-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.647986 4749 generic.go:334] "Generic (PLEG): container finished" podID="dee5d824-f47d-4120-9587-d9c78449c492" 
containerID="479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74" exitCode=0 Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.648041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qll7d" event={"ID":"dee5d824-f47d-4120-9587-d9c78449c492","Type":"ContainerDied","Data":"479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74"} Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.648072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qll7d" event={"ID":"dee5d824-f47d-4120-9587-d9c78449c492","Type":"ContainerDied","Data":"1ceb2dfac1477c7c6a160fdd7942aee8afbf52be77d9a73ea84c01307992b720"} Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.648088 4749 scope.go:117] "RemoveContainer" containerID="479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.648199 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qll7d" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.677939 4749 scope.go:117] "RemoveContainer" containerID="7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.695221 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qll7d"] Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.707120 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qll7d"] Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.718120 4749 scope.go:117] "RemoveContainer" containerID="05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.756869 4749 scope.go:117] "RemoveContainer" containerID="479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74" Feb 19 19:19:08 crc kubenswrapper[4749]: E0219 19:19:08.757278 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74\": container with ID starting with 479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74 not found: ID does not exist" containerID="479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.757306 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74"} err="failed to get container status \"479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74\": rpc error: code = NotFound desc = could not find container \"479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74\": container with ID starting with 479d3cadabb8f7807508a005633c7e3f45fbf750c8bdbb3de88516a07142ee74 not 
found: ID does not exist" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.757327 4749 scope.go:117] "RemoveContainer" containerID="7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759" Feb 19 19:19:08 crc kubenswrapper[4749]: E0219 19:19:08.757901 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759\": container with ID starting with 7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759 not found: ID does not exist" containerID="7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.757922 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759"} err="failed to get container status \"7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759\": rpc error: code = NotFound desc = could not find container \"7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759\": container with ID starting with 7cbf29943b8f79cb391961598bc93e7f8cfcd1836cd2a348bfd49514750dc759 not found: ID does not exist" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.757934 4749 scope.go:117] "RemoveContainer" containerID="05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202" Feb 19 19:19:08 crc kubenswrapper[4749]: E0219 19:19:08.758378 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202\": container with ID starting with 05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202 not found: ID does not exist" containerID="05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202" Feb 19 19:19:08 crc kubenswrapper[4749]: I0219 19:19:08.758397 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202"} err="failed to get container status \"05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202\": rpc error: code = NotFound desc = could not find container \"05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202\": container with ID starting with 05206eb211ca92c633b408f9ac00c89034898e0558c94d438490db39b2af0202 not found: ID does not exist" Feb 19 19:19:09 crc kubenswrapper[4749]: I0219 19:19:09.077542 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 19:19:09 crc kubenswrapper[4749]: I0219 19:19:09.086238 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 19:19:09 crc kubenswrapper[4749]: I0219 19:19:09.660694 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 19:19:10 crc kubenswrapper[4749]: I0219 19:19:10.703530 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee5d824-f47d-4120-9587-d9c78449c492" path="/var/lib/kubelet/pods/dee5d824-f47d-4120-9587-d9c78449c492/volumes" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.583233 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 19:19:13 crc kubenswrapper[4749]: E0219 19:19:13.583985 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="extract-content" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.584000 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="extract-content" Feb 19 19:19:13 crc kubenswrapper[4749]: E0219 19:19:13.584059 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="registry-server" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.584068 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="registry-server" Feb 19 19:19:13 crc kubenswrapper[4749]: E0219 19:19:13.584082 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="extract-utilities" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.584088 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="extract-utilities" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.584278 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee5d824-f47d-4120-9587-d9c78449c492" containerName="registry-server" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.584983 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.590913 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.591149 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.591317 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.596530 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tl9hr" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.613422 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.670346 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.670646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtf8\" (UniqueName: \"kubernetes.io/projected/656c9f00-c5aa-4d25-b425-84c0ce173433-kube-api-access-mqtf8\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.670813 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.670929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.672223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.672384 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.672491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-config-data\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.672682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.673356 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775439 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-config-data\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mqtf8\" (UniqueName: \"kubernetes.io/projected/656c9f00-c5aa-4d25-b425-84c0ce173433-kube-api-access-mqtf8\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.775704 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.776541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.776756 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.776768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.777263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.777693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-config-data\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.783065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.783223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.783473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.792806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtf8\" (UniqueName: \"kubernetes.io/projected/656c9f00-c5aa-4d25-b425-84c0ce173433-kube-api-access-mqtf8\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.808527 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " pod="openstack/tempest-tests-tempest" Feb 19 19:19:13 crc kubenswrapper[4749]: I0219 19:19:13.913712 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 19:19:14 crc kubenswrapper[4749]: I0219 19:19:14.351756 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 19:19:14 crc kubenswrapper[4749]: I0219 19:19:14.703932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"656c9f00-c5aa-4d25-b425-84c0ce173433","Type":"ContainerStarted","Data":"217e50d8dcb7fe0119bde170bc997d9c44e71b7d36574e0b1cdf6b3832f1cb13"} Feb 19 19:19:25 crc kubenswrapper[4749]: I0219 19:19:25.799964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"656c9f00-c5aa-4d25-b425-84c0ce173433","Type":"ContainerStarted","Data":"4c00e383e557ddaab27532f44ffd780aa43c3e24dd11aba0455e0d56b77de22a"} Feb 19 19:19:25 crc kubenswrapper[4749]: I0219 19:19:25.821064 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.649962967 podStartE2EDuration="13.821017099s" 
podCreationTimestamp="2026-02-19 19:19:12 +0000 UTC" firstStartedPulling="2026-02-19 19:19:14.355433651 +0000 UTC m=+2728.316653605" lastFinishedPulling="2026-02-19 19:19:24.526487733 +0000 UTC m=+2738.487707737" observedRunningTime="2026-02-19 19:19:25.813406287 +0000 UTC m=+2739.774626251" watchObservedRunningTime="2026-02-19 19:19:25.821017099 +0000 UTC m=+2739.782237053" Feb 19 19:19:40 crc kubenswrapper[4749]: I0219 19:19:40.846575 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sg8s9"] Feb 19 19:19:40 crc kubenswrapper[4749]: I0219 19:19:40.850445 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:40 crc kubenswrapper[4749]: I0219 19:19:40.875518 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sg8s9"] Feb 19 19:19:40 crc kubenswrapper[4749]: I0219 19:19:40.916484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-catalog-content\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:40 crc kubenswrapper[4749]: I0219 19:19:40.916550 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdxk\" (UniqueName: \"kubernetes.io/projected/5e05f121-510d-4f58-8c95-74d770eb37ea-kube-api-access-lcdxk\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:40 crc kubenswrapper[4749]: I0219 19:19:40.916584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-utilities\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:41 crc kubenswrapper[4749]: I0219 19:19:41.018470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-catalog-content\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:41 crc kubenswrapper[4749]: I0219 19:19:41.018525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdxk\" (UniqueName: \"kubernetes.io/projected/5e05f121-510d-4f58-8c95-74d770eb37ea-kube-api-access-lcdxk\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:41 crc kubenswrapper[4749]: I0219 19:19:41.018549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-utilities\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:41 crc kubenswrapper[4749]: I0219 19:19:41.019143 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-catalog-content\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:41 crc kubenswrapper[4749]: I0219 19:19:41.019170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-utilities\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:41 crc kubenswrapper[4749]: I0219 19:19:41.042529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdxk\" (UniqueName: \"kubernetes.io/projected/5e05f121-510d-4f58-8c95-74d770eb37ea-kube-api-access-lcdxk\") pod \"certified-operators-sg8s9\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") " pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:41 crc kubenswrapper[4749]: I0219 19:19:41.196674 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:42 crc kubenswrapper[4749]: I0219 19:19:41.774060 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sg8s9"] Feb 19 19:19:42 crc kubenswrapper[4749]: I0219 19:19:41.940303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg8s9" event={"ID":"5e05f121-510d-4f58-8c95-74d770eb37ea","Type":"ContainerStarted","Data":"6ff9771acd1d12c6d1c4d19bab0d769ec2685cb45e31369c12c8a6c3c168ad77"} Feb 19 19:19:42 crc kubenswrapper[4749]: I0219 19:19:42.951536 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerID="303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f" exitCode=0 Feb 19 19:19:42 crc kubenswrapper[4749]: I0219 19:19:42.951648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg8s9" event={"ID":"5e05f121-510d-4f58-8c95-74d770eb37ea","Type":"ContainerDied","Data":"303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f"} Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.253575 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-t6dx2"] Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.256537 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.300976 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6dx2"] Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.384262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pclmj\" (UniqueName: \"kubernetes.io/projected/25dfd640-2593-4dd0-8685-3c7018355375-kube-api-access-pclmj\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.387233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-utilities\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.387502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-catalog-content\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.489203 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-utilities\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " 
pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.489302 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-catalog-content\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.489363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pclmj\" (UniqueName: \"kubernetes.io/projected/25dfd640-2593-4dd0-8685-3c7018355375-kube-api-access-pclmj\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.490056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-utilities\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.490277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-catalog-content\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.516249 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pclmj\" (UniqueName: \"kubernetes.io/projected/25dfd640-2593-4dd0-8685-3c7018355375-kube-api-access-pclmj\") pod \"redhat-operators-t6dx2\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") " pod="openshift-marketplace/redhat-operators-t6dx2" Feb 
19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.579551 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6dx2" Feb 19 19:19:43 crc kubenswrapper[4749]: I0219 19:19:43.963508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg8s9" event={"ID":"5e05f121-510d-4f58-8c95-74d770eb37ea","Type":"ContainerStarted","Data":"5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae"} Feb 19 19:19:44 crc kubenswrapper[4749]: I0219 19:19:44.188430 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6dx2"] Feb 19 19:19:44 crc kubenswrapper[4749]: W0219 19:19:44.190445 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25dfd640_2593_4dd0_8685_3c7018355375.slice/crio-3e56e91f99089a3d7972b1d162f56e2e958a84692aff52462ed4bc6543b71426 WatchSource:0}: Error finding container 3e56e91f99089a3d7972b1d162f56e2e958a84692aff52462ed4bc6543b71426: Status 404 returned error can't find the container with id 3e56e91f99089a3d7972b1d162f56e2e958a84692aff52462ed4bc6543b71426 Feb 19 19:19:44 crc kubenswrapper[4749]: I0219 19:19:44.988849 4749 generic.go:334] "Generic (PLEG): container finished" podID="25dfd640-2593-4dd0-8685-3c7018355375" containerID="918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77" exitCode=0 Feb 19 19:19:44 crc kubenswrapper[4749]: I0219 19:19:44.990068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6dx2" event={"ID":"25dfd640-2593-4dd0-8685-3c7018355375","Type":"ContainerDied","Data":"918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77"} Feb 19 19:19:44 crc kubenswrapper[4749]: I0219 19:19:44.990405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6dx2" 
event={"ID":"25dfd640-2593-4dd0-8685-3c7018355375","Type":"ContainerStarted","Data":"3e56e91f99089a3d7972b1d162f56e2e958a84692aff52462ed4bc6543b71426"} Feb 19 19:19:46 crc kubenswrapper[4749]: I0219 19:19:46.003169 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerID="5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae" exitCode=0 Feb 19 19:19:46 crc kubenswrapper[4749]: I0219 19:19:46.003354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg8s9" event={"ID":"5e05f121-510d-4f58-8c95-74d770eb37ea","Type":"ContainerDied","Data":"5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae"} Feb 19 19:19:47 crc kubenswrapper[4749]: I0219 19:19:47.015268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6dx2" event={"ID":"25dfd640-2593-4dd0-8685-3c7018355375","Type":"ContainerStarted","Data":"c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301"} Feb 19 19:19:47 crc kubenswrapper[4749]: I0219 19:19:47.018912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg8s9" event={"ID":"5e05f121-510d-4f58-8c95-74d770eb37ea","Type":"ContainerStarted","Data":"196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d"} Feb 19 19:19:47 crc kubenswrapper[4749]: I0219 19:19:47.063912 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sg8s9" podStartSLOduration=3.563727612 podStartE2EDuration="7.063891639s" podCreationTimestamp="2026-02-19 19:19:40 +0000 UTC" firstStartedPulling="2026-02-19 19:19:42.953812872 +0000 UTC m=+2756.915032826" lastFinishedPulling="2026-02-19 19:19:46.453976899 +0000 UTC m=+2760.415196853" observedRunningTime="2026-02-19 19:19:47.058859928 +0000 UTC m=+2761.020079892" watchObservedRunningTime="2026-02-19 19:19:47.063891639 +0000 UTC 
m=+2761.025111593" Feb 19 19:19:51 crc kubenswrapper[4749]: I0219 19:19:51.197313 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:51 crc kubenswrapper[4749]: I0219 19:19:51.197902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:51 crc kubenswrapper[4749]: I0219 19:19:51.257777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:52 crc kubenswrapper[4749]: I0219 19:19:52.065155 4749 generic.go:334] "Generic (PLEG): container finished" podID="25dfd640-2593-4dd0-8685-3c7018355375" containerID="c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301" exitCode=0 Feb 19 19:19:52 crc kubenswrapper[4749]: I0219 19:19:52.065251 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6dx2" event={"ID":"25dfd640-2593-4dd0-8685-3c7018355375","Type":"ContainerDied","Data":"c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301"} Feb 19 19:19:52 crc kubenswrapper[4749]: I0219 19:19:52.140596 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sg8s9" Feb 19 19:19:53 crc kubenswrapper[4749]: I0219 19:19:53.081672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6dx2" event={"ID":"25dfd640-2593-4dd0-8685-3c7018355375","Type":"ContainerStarted","Data":"e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74"} Feb 19 19:19:53 crc kubenswrapper[4749]: I0219 19:19:53.120424 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t6dx2" podStartSLOduration=2.605018922 podStartE2EDuration="10.120407877s" podCreationTimestamp="2026-02-19 19:19:43 +0000 UTC" 
firstStartedPulling="2026-02-19 19:19:44.991723831 +0000 UTC m=+2758.952943785" lastFinishedPulling="2026-02-19 19:19:52.507112786 +0000 UTC m=+2766.468332740" observedRunningTime="2026-02-19 19:19:53.11762236 +0000 UTC m=+2767.078842324" watchObservedRunningTime="2026-02-19 19:19:53.120407877 +0000 UTC m=+2767.081627831"
Feb 19 19:19:53 crc kubenswrapper[4749]: I0219 19:19:53.580310 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t6dx2"
Feb 19 19:19:53 crc kubenswrapper[4749]: I0219 19:19:53.580382 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t6dx2"
Feb 19 19:19:53 crc kubenswrapper[4749]: I0219 19:19:53.638079 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sg8s9"]
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.087576 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sg8s9" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="registry-server" containerID="cri-o://196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d" gracePeriod=2
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.553472 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sg8s9"
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.630481 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6dx2" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:19:54 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:19:54 crc kubenswrapper[4749]: >
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.726944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcdxk\" (UniqueName: \"kubernetes.io/projected/5e05f121-510d-4f58-8c95-74d770eb37ea-kube-api-access-lcdxk\") pod \"5e05f121-510d-4f58-8c95-74d770eb37ea\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") "
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.727171 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-utilities\") pod \"5e05f121-510d-4f58-8c95-74d770eb37ea\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") "
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.727300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-catalog-content\") pod \"5e05f121-510d-4f58-8c95-74d770eb37ea\" (UID: \"5e05f121-510d-4f58-8c95-74d770eb37ea\") "
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.727924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-utilities" (OuterVolumeSpecName: "utilities") pod "5e05f121-510d-4f58-8c95-74d770eb37ea" (UID: "5e05f121-510d-4f58-8c95-74d770eb37ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.728272 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.737449 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e05f121-510d-4f58-8c95-74d770eb37ea-kube-api-access-lcdxk" (OuterVolumeSpecName: "kube-api-access-lcdxk") pod "5e05f121-510d-4f58-8c95-74d770eb37ea" (UID: "5e05f121-510d-4f58-8c95-74d770eb37ea"). InnerVolumeSpecName "kube-api-access-lcdxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.789628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e05f121-510d-4f58-8c95-74d770eb37ea" (UID: "5e05f121-510d-4f58-8c95-74d770eb37ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.830012 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcdxk\" (UniqueName: \"kubernetes.io/projected/5e05f121-510d-4f58-8c95-74d770eb37ea-kube-api-access-lcdxk\") on node \"crc\" DevicePath \"\""
Feb 19 19:19:54 crc kubenswrapper[4749]: I0219 19:19:54.830063 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e05f121-510d-4f58-8c95-74d770eb37ea-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.097932 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerID="196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d" exitCode=0
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.097992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg8s9" event={"ID":"5e05f121-510d-4f58-8c95-74d770eb37ea","Type":"ContainerDied","Data":"196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d"}
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.098048 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sg8s9"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.098073 4749 scope.go:117] "RemoveContainer" containerID="196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.098058 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg8s9" event={"ID":"5e05f121-510d-4f58-8c95-74d770eb37ea","Type":"ContainerDied","Data":"6ff9771acd1d12c6d1c4d19bab0d769ec2685cb45e31369c12c8a6c3c168ad77"}
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.118909 4749 scope.go:117] "RemoveContainer" containerID="5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.134379 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sg8s9"]
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.143370 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sg8s9"]
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.148338 4749 scope.go:117] "RemoveContainer" containerID="303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.182630 4749 scope.go:117] "RemoveContainer" containerID="196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d"
Feb 19 19:19:55 crc kubenswrapper[4749]: E0219 19:19:55.182991 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d\": container with ID starting with 196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d not found: ID does not exist" containerID="196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.183066 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d"} err="failed to get container status \"196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d\": rpc error: code = NotFound desc = could not find container \"196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d\": container with ID starting with 196a3286e3e87a436acc1714988567331469e68f30099aae5f071d0cb980886d not found: ID does not exist"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.183100 4749 scope.go:117] "RemoveContainer" containerID="5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae"
Feb 19 19:19:55 crc kubenswrapper[4749]: E0219 19:19:55.183382 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae\": container with ID starting with 5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae not found: ID does not exist" containerID="5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.183414 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae"} err="failed to get container status \"5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae\": rpc error: code = NotFound desc = could not find container \"5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae\": container with ID starting with 5ad1457d678f50d5c353e8380fa5a13db734f3ffa27ffaf9791352758776a3ae not found: ID does not exist"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.183434 4749 scope.go:117] "RemoveContainer" containerID="303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f"
Feb 19 19:19:55 crc kubenswrapper[4749]: E0219 19:19:55.183650 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f\": container with ID starting with 303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f not found: ID does not exist" containerID="303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f"
Feb 19 19:19:55 crc kubenswrapper[4749]: I0219 19:19:55.183681 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f"} err="failed to get container status \"303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f\": rpc error: code = NotFound desc = could not find container \"303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f\": container with ID starting with 303bcc0f4a6be689e97f9ff13be836e680ebcb33d6dffb477c68199d7f46d50f not found: ID does not exist"
Feb 19 19:19:56 crc kubenswrapper[4749]: I0219 19:19:56.703806 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" path="/var/lib/kubelet/pods/5e05f121-510d-4f58-8c95-74d770eb37ea/volumes"
Feb 19 19:20:04 crc kubenswrapper[4749]: I0219 19:20:04.635421 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6dx2" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:20:04 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:20:04 crc kubenswrapper[4749]: >
Feb 19 19:20:13 crc kubenswrapper[4749]: I0219 19:20:13.634972 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t6dx2"
Feb 19 19:20:13 crc kubenswrapper[4749]: I0219 19:20:13.712864 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t6dx2"
Feb 19 19:20:14 crc kubenswrapper[4749]: I0219 19:20:14.455934 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6dx2"]
Feb 19 19:20:15 crc kubenswrapper[4749]: I0219 19:20:15.282920 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t6dx2" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="registry-server" containerID="cri-o://e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74" gracePeriod=2
Feb 19 19:20:15 crc kubenswrapper[4749]: I0219 19:20:15.876412 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6dx2"
Feb 19 19:20:15 crc kubenswrapper[4749]: I0219 19:20:15.972671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-utilities\") pod \"25dfd640-2593-4dd0-8685-3c7018355375\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") "
Feb 19 19:20:15 crc kubenswrapper[4749]: I0219 19:20:15.972943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pclmj\" (UniqueName: \"kubernetes.io/projected/25dfd640-2593-4dd0-8685-3c7018355375-kube-api-access-pclmj\") pod \"25dfd640-2593-4dd0-8685-3c7018355375\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") "
Feb 19 19:20:15 crc kubenswrapper[4749]: I0219 19:20:15.973079 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-catalog-content\") pod \"25dfd640-2593-4dd0-8685-3c7018355375\" (UID: \"25dfd640-2593-4dd0-8685-3c7018355375\") "
Feb 19 19:20:15 crc kubenswrapper[4749]: I0219 19:20:15.974671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-utilities" (OuterVolumeSpecName: "utilities") pod "25dfd640-2593-4dd0-8685-3c7018355375" (UID: "25dfd640-2593-4dd0-8685-3c7018355375"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.011014 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25dfd640-2593-4dd0-8685-3c7018355375-kube-api-access-pclmj" (OuterVolumeSpecName: "kube-api-access-pclmj") pod "25dfd640-2593-4dd0-8685-3c7018355375" (UID: "25dfd640-2593-4dd0-8685-3c7018355375"). InnerVolumeSpecName "kube-api-access-pclmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.076623 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pclmj\" (UniqueName: \"kubernetes.io/projected/25dfd640-2593-4dd0-8685-3c7018355375-kube-api-access-pclmj\") on node \"crc\" DevicePath \"\""
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.076678 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.143582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25dfd640-2593-4dd0-8685-3c7018355375" (UID: "25dfd640-2593-4dd0-8685-3c7018355375"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.178231 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dfd640-2593-4dd0-8685-3c7018355375-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.306148 4749 generic.go:334] "Generic (PLEG): container finished" podID="25dfd640-2593-4dd0-8685-3c7018355375" containerID="e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74" exitCode=0
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.306207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6dx2" event={"ID":"25dfd640-2593-4dd0-8685-3c7018355375","Type":"ContainerDied","Data":"e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74"}
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.306248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6dx2" event={"ID":"25dfd640-2593-4dd0-8685-3c7018355375","Type":"ContainerDied","Data":"3e56e91f99089a3d7972b1d162f56e2e958a84692aff52462ed4bc6543b71426"}
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.306280 4749 scope.go:117] "RemoveContainer" containerID="e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.306509 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6dx2"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.357534 4749 scope.go:117] "RemoveContainer" containerID="c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.362051 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6dx2"]
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.384845 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t6dx2"]
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.409011 4749 scope.go:117] "RemoveContainer" containerID="918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.449631 4749 scope.go:117] "RemoveContainer" containerID="e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74"
Feb 19 19:20:16 crc kubenswrapper[4749]: E0219 19:20:16.450126 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74\": container with ID starting with e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74 not found: ID does not exist" containerID="e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.450157 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74"} err="failed to get container status \"e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74\": rpc error: code = NotFound desc = could not find container \"e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74\": container with ID starting with e5e9929bdd018dc1b139162212a3e5bcbccf63c3d4cec85de7cd79594fed9a74 not found: ID does not exist"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.450178 4749 scope.go:117] "RemoveContainer" containerID="c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301"
Feb 19 19:20:16 crc kubenswrapper[4749]: E0219 19:20:16.450566 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301\": container with ID starting with c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301 not found: ID does not exist" containerID="c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.450590 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301"} err="failed to get container status \"c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301\": rpc error: code = NotFound desc = could not find container \"c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301\": container with ID starting with c21c92a8d6b66886f93c74c4f6cec5896d6ab28771788452bb253badf9b5c301 not found: ID does not exist"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.450607 4749 scope.go:117] "RemoveContainer" containerID="918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77"
Feb 19 19:20:16 crc kubenswrapper[4749]: E0219 19:20:16.450887 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77\": container with ID starting with 918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77 not found: ID does not exist" containerID="918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.450912 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77"} err="failed to get container status \"918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77\": rpc error: code = NotFound desc = could not find container \"918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77\": container with ID starting with 918ab814f948d2f05be293fb6f5cac75231d6240f0ead4d0e587d7df5abd7c77 not found: ID does not exist"
Feb 19 19:20:16 crc kubenswrapper[4749]: I0219 19:20:16.690994 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25dfd640-2593-4dd0-8685-3c7018355375" path="/var/lib/kubelet/pods/25dfd640-2593-4dd0-8685-3c7018355375/volumes"
Feb 19 19:20:54 crc kubenswrapper[4749]: I0219 19:20:54.725379 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:20:54 crc kubenswrapper[4749]: I0219 19:20:54.725927 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:21:24 crc kubenswrapper[4749]: I0219 19:21:24.725769 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:21:24 crc kubenswrapper[4749]: I0219 19:21:24.726519 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:21:54 crc kubenswrapper[4749]: I0219 19:21:54.726103 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:21:54 crc kubenswrapper[4749]: I0219 19:21:54.726775 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:21:54 crc kubenswrapper[4749]: I0219 19:21:54.726824 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 19:21:54 crc kubenswrapper[4749]: I0219 19:21:54.727742 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8093d672b657eb163bf89fa6162798b2d16c7bce8e69d30b979dad8d5d696438"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:21:54 crc kubenswrapper[4749]: I0219 19:21:54.727801 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://8093d672b657eb163bf89fa6162798b2d16c7bce8e69d30b979dad8d5d696438" gracePeriod=600
Feb 19 19:21:55 crc kubenswrapper[4749]: I0219 19:21:55.252519 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="8093d672b657eb163bf89fa6162798b2d16c7bce8e69d30b979dad8d5d696438" exitCode=0
Feb 19 19:21:55 crc kubenswrapper[4749]: I0219 19:21:55.252611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"8093d672b657eb163bf89fa6162798b2d16c7bce8e69d30b979dad8d5d696438"}
Feb 19 19:21:55 crc kubenswrapper[4749]: I0219 19:21:55.253049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"}
Feb 19 19:21:55 crc kubenswrapper[4749]: I0219 19:21:55.253080 4749 scope.go:117] "RemoveContainer" containerID="8322baf85274ddd987f2fee258e98592cd052c591ae677d8c4cecf1c6034cb55"
Feb 19 19:24:24 crc kubenswrapper[4749]: I0219 19:24:24.725303 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:24:24 crc kubenswrapper[4749]: I0219 19:24:24.726819 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:24:54 crc kubenswrapper[4749]: I0219 19:24:54.725809 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:24:54 crc kubenswrapper[4749]: I0219 19:24:54.726343 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:25:24 crc kubenswrapper[4749]: I0219 19:25:24.725244 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:25:24 crc kubenswrapper[4749]: I0219 19:25:24.725999 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:25:24 crc kubenswrapper[4749]: I0219 19:25:24.726069 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 19:25:24 crc kubenswrapper[4749]: I0219 19:25:24.726886 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:25:24 crc kubenswrapper[4749]: I0219 19:25:24.726939 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" gracePeriod=600
Feb 19 19:25:24 crc kubenswrapper[4749]: E0219 19:25:24.851325 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:25:25 crc kubenswrapper[4749]: I0219 19:25:25.260824 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" exitCode=0
Feb 19 19:25:25 crc kubenswrapper[4749]: I0219 19:25:25.260893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"}
Feb 19 19:25:25 crc kubenswrapper[4749]: I0219 19:25:25.261355 4749 scope.go:117] "RemoveContainer" containerID="8093d672b657eb163bf89fa6162798b2d16c7bce8e69d30b979dad8d5d696438"
Feb 19 19:25:25 crc kubenswrapper[4749]: I0219 19:25:25.261828 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:25:25 crc kubenswrapper[4749]: E0219 19:25:25.262152 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:25:39 crc kubenswrapper[4749]: I0219 19:25:39.679461 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:25:39 crc kubenswrapper[4749]: E0219 19:25:39.680239 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:25:51 crc kubenswrapper[4749]: I0219 19:25:51.678992 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:25:51 crc kubenswrapper[4749]: E0219 19:25:51.680209 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:26:03 crc kubenswrapper[4749]: I0219 19:26:03.679441 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:26:03 crc kubenswrapper[4749]: E0219 19:26:03.680297 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:26:17 crc kubenswrapper[4749]: I0219 19:26:17.678619 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:26:17 crc kubenswrapper[4749]: E0219 19:26:17.680202 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:26:31 crc kubenswrapper[4749]: I0219 19:26:31.678972 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:26:31 crc kubenswrapper[4749]: E0219 19:26:31.679812 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:26:45 crc kubenswrapper[4749]: I0219 19:26:45.678965 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:26:45 crc kubenswrapper[4749]: E0219 19:26:45.679686 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:26:56 crc kubenswrapper[4749]: I0219 19:26:56.686087 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:26:56 crc kubenswrapper[4749]: E0219 19:26:56.686985 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:27:08 crc kubenswrapper[4749]: I0219 19:27:08.678971 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:27:08 crc kubenswrapper[4749]: E0219 19:27:08.680383 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:27:23 crc kubenswrapper[4749]: I0219 19:27:23.679353 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:27:23 crc kubenswrapper[4749]: E0219 19:27:23.680218 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:27:37 crc kubenswrapper[4749]: I0219 19:27:37.679499 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:27:37 crc kubenswrapper[4749]: E0219 19:27:37.680285 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:27:48 crc kubenswrapper[4749]: I0219 19:27:48.680592 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:27:48 crc kubenswrapper[4749]: E0219 19:27:48.681586 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:27:59 crc kubenswrapper[4749]: I0219
19:27:59.679675 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:27:59 crc kubenswrapper[4749]: E0219 19:27:59.681622 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.607393 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jwx7c"] Feb 19 19:28:05 crc kubenswrapper[4749]: E0219 19:28:05.608650 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="registry-server" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.608672 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="registry-server" Feb 19 19:28:05 crc kubenswrapper[4749]: E0219 19:28:05.608690 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="extract-utilities" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.608701 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="extract-utilities" Feb 19 19:28:05 crc kubenswrapper[4749]: E0219 19:28:05.608740 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="extract-content" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.608753 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="extract-content" Feb 19 
19:28:05 crc kubenswrapper[4749]: E0219 19:28:05.608768 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="extract-content" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.608779 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="extract-content" Feb 19 19:28:05 crc kubenswrapper[4749]: E0219 19:28:05.608829 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="extract-utilities" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.608843 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="extract-utilities" Feb 19 19:28:05 crc kubenswrapper[4749]: E0219 19:28:05.608868 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="registry-server" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.608879 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="registry-server" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.609245 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e05f121-510d-4f58-8c95-74d770eb37ea" containerName="registry-server" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.609309 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="25dfd640-2593-4dd0-8685-3c7018355375" containerName="registry-server" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.611776 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.624498 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwx7c"] Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.702317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-utilities\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.702497 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr92r\" (UniqueName: \"kubernetes.io/projected/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-kube-api-access-kr92r\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.702685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-catalog-content\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.805609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr92r\" (UniqueName: \"kubernetes.io/projected/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-kube-api-access-kr92r\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.805691 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-catalog-content\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.805810 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-utilities\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.806463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-utilities\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.806893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-catalog-content\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.832080 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr92r\" (UniqueName: \"kubernetes.io/projected/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-kube-api-access-kr92r\") pod \"redhat-marketplace-jwx7c\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:05 crc kubenswrapper[4749]: I0219 19:28:05.939228 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:06 crc kubenswrapper[4749]: I0219 19:28:06.393101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwx7c"] Feb 19 19:28:06 crc kubenswrapper[4749]: I0219 19:28:06.862129 4749 generic.go:334] "Generic (PLEG): container finished" podID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerID="3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3" exitCode=0 Feb 19 19:28:06 crc kubenswrapper[4749]: I0219 19:28:06.862203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwx7c" event={"ID":"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6","Type":"ContainerDied","Data":"3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3"} Feb 19 19:28:06 crc kubenswrapper[4749]: I0219 19:28:06.862519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwx7c" event={"ID":"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6","Type":"ContainerStarted","Data":"c172e97bab0271b1b4a9ffc6458a460023858c60fff48082f3babd7bc30139fd"} Feb 19 19:28:06 crc kubenswrapper[4749]: I0219 19:28:06.864512 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:28:08 crc kubenswrapper[4749]: I0219 19:28:08.888866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwx7c" event={"ID":"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6","Type":"ContainerStarted","Data":"db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a"} Feb 19 19:28:09 crc kubenswrapper[4749]: I0219 19:28:09.909594 4749 generic.go:334] "Generic (PLEG): container finished" podID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerID="db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a" exitCode=0 Feb 19 19:28:09 crc kubenswrapper[4749]: I0219 19:28:09.909919 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-jwx7c" event={"ID":"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6","Type":"ContainerDied","Data":"db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a"} Feb 19 19:28:10 crc kubenswrapper[4749]: I0219 19:28:10.924972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwx7c" event={"ID":"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6","Type":"ContainerStarted","Data":"1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c"} Feb 19 19:28:10 crc kubenswrapper[4749]: I0219 19:28:10.957813 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jwx7c" podStartSLOduration=2.498421307 podStartE2EDuration="5.957794312s" podCreationTimestamp="2026-02-19 19:28:05 +0000 UTC" firstStartedPulling="2026-02-19 19:28:06.864130804 +0000 UTC m=+3260.825350788" lastFinishedPulling="2026-02-19 19:28:10.323503839 +0000 UTC m=+3264.284723793" observedRunningTime="2026-02-19 19:28:10.949610465 +0000 UTC m=+3264.910830429" watchObservedRunningTime="2026-02-19 19:28:10.957794312 +0000 UTC m=+3264.919014276" Feb 19 19:28:12 crc kubenswrapper[4749]: I0219 19:28:12.678995 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:28:12 crc kubenswrapper[4749]: E0219 19:28:12.679342 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:28:15 crc kubenswrapper[4749]: I0219 19:28:15.939437 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:15 crc kubenswrapper[4749]: I0219 19:28:15.940036 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:16 crc kubenswrapper[4749]: I0219 19:28:16.012532 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:16 crc kubenswrapper[4749]: I0219 19:28:16.101927 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:16 crc kubenswrapper[4749]: I0219 19:28:16.264495 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwx7c"] Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.000959 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jwx7c" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="registry-server" containerID="cri-o://1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c" gracePeriod=2 Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.509319 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.607836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-utilities\") pod \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.607951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-catalog-content\") pod \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.608337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr92r\" (UniqueName: \"kubernetes.io/projected/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-kube-api-access-kr92r\") pod \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\" (UID: \"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6\") " Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.609942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-utilities" (OuterVolumeSpecName: "utilities") pod "dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" (UID: "dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.620054 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-kube-api-access-kr92r" (OuterVolumeSpecName: "kube-api-access-kr92r") pod "dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" (UID: "dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6"). InnerVolumeSpecName "kube-api-access-kr92r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.636532 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" (UID: "dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.710919 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr92r\" (UniqueName: \"kubernetes.io/projected/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-kube-api-access-kr92r\") on node \"crc\" DevicePath \"\"" Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.710960 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:28:18 crc kubenswrapper[4749]: I0219 19:28:18.710976 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.025275 4749 generic.go:334] "Generic (PLEG): container finished" podID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerID="1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c" exitCode=0 Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.025409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwx7c" event={"ID":"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6","Type":"ContainerDied","Data":"1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c"} Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.025466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jwx7c" event={"ID":"dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6","Type":"ContainerDied","Data":"c172e97bab0271b1b4a9ffc6458a460023858c60fff48082f3babd7bc30139fd"} Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.025500 4749 scope.go:117] "RemoveContainer" containerID="1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.026549 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwx7c" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.066005 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwx7c"] Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.074579 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwx7c"] Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.075072 4749 scope.go:117] "RemoveContainer" containerID="db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.108278 4749 scope.go:117] "RemoveContainer" containerID="3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.171630 4749 scope.go:117] "RemoveContainer" containerID="1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c" Feb 19 19:28:19 crc kubenswrapper[4749]: E0219 19:28:19.172238 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c\": container with ID starting with 1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c not found: ID does not exist" containerID="1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.172357 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c"} err="failed to get container status \"1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c\": rpc error: code = NotFound desc = could not find container \"1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c\": container with ID starting with 1cb0123394290bf2c45d1cf79ad6da6e511a4b520188e7a51dc870c133c7769c not found: ID does not exist" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.172452 4749 scope.go:117] "RemoveContainer" containerID="db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a" Feb 19 19:28:19 crc kubenswrapper[4749]: E0219 19:28:19.172777 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a\": container with ID starting with db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a not found: ID does not exist" containerID="db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.172860 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a"} err="failed to get container status \"db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a\": rpc error: code = NotFound desc = could not find container \"db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a\": container with ID starting with db6d7c95fd707aee735cdde75cd5ddd147c60db0dad9c49ffb90296b1d84f22a not found: ID does not exist" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.172979 4749 scope.go:117] "RemoveContainer" containerID="3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3" Feb 19 19:28:19 crc kubenswrapper[4749]: E0219 
19:28:19.173452 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3\": container with ID starting with 3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3 not found: ID does not exist" containerID="3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3" Feb 19 19:28:19 crc kubenswrapper[4749]: I0219 19:28:19.173482 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3"} err="failed to get container status \"3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3\": rpc error: code = NotFound desc = could not find container \"3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3\": container with ID starting with 3335b897d82cac5903c411204559a00c9f104768e458bc885db6e691f760bcc3 not found: ID does not exist" Feb 19 19:28:20 crc kubenswrapper[4749]: I0219 19:28:20.689592 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" path="/var/lib/kubelet/pods/dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6/volumes" Feb 19 19:28:27 crc kubenswrapper[4749]: I0219 19:28:27.679459 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:28:27 crc kubenswrapper[4749]: E0219 19:28:27.680232 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:28:39 crc kubenswrapper[4749]: I0219 19:28:39.679018 
4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:28:39 crc kubenswrapper[4749]: E0219 19:28:39.680584 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:28:50 crc kubenswrapper[4749]: I0219 19:28:50.679179 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:28:50 crc kubenswrapper[4749]: E0219 19:28:50.680332 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:29:03 crc kubenswrapper[4749]: I0219 19:29:03.679973 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:29:03 crc kubenswrapper[4749]: E0219 19:29:03.681220 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:29:14 crc kubenswrapper[4749]: I0219 
19:29:14.679994 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:29:14 crc kubenswrapper[4749]: E0219 19:29:14.682204 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:29:29 crc kubenswrapper[4749]: I0219 19:29:29.679139 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:29:29 crc kubenswrapper[4749]: E0219 19:29:29.680193 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:29:41 crc kubenswrapper[4749]: I0219 19:29:41.679749 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:29:41 crc kubenswrapper[4749]: E0219 19:29:41.680608 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:29:55 crc 
kubenswrapper[4749]: I0219 19:29:55.678486 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:29:55 crc kubenswrapper[4749]: E0219 19:29:55.679292 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.861265 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2xxvx"]
Feb 19 19:29:56 crc kubenswrapper[4749]: E0219 19:29:56.862316 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="extract-utilities"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.862340 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="extract-utilities"
Feb 19 19:29:56 crc kubenswrapper[4749]: E0219 19:29:56.862376 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="extract-content"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.862388 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="extract-content"
Feb 19 19:29:56 crc kubenswrapper[4749]: E0219 19:29:56.862416 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="registry-server"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.862427 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="registry-server"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.862752 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbaa1066-fbac-46ad-a7b1-ca8aac9ea9f6" containerName="registry-server"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.866878 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.893049 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xxvx"]
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.963269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/ed9e8860-db2a-48d1-a571-6475da138f70-kube-api-access-m27m2\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.963323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-utilities\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:56 crc kubenswrapper[4749]: I0219 19:29:56.963518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-catalog-content\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.065035 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/ed9e8860-db2a-48d1-a571-6475da138f70-kube-api-access-m27m2\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.065110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-utilities\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.065191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-catalog-content\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.065702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-catalog-content\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.065963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-utilities\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.084786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/ed9e8860-db2a-48d1-a571-6475da138f70-kube-api-access-m27m2\") pod \"community-operators-2xxvx\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") " pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.203219 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.722867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xxvx"]
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.994816 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed9e8860-db2a-48d1-a571-6475da138f70" containerID="11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc" exitCode=0
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.994862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xxvx" event={"ID":"ed9e8860-db2a-48d1-a571-6475da138f70","Type":"ContainerDied","Data":"11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc"}
Feb 19 19:29:57 crc kubenswrapper[4749]: I0219 19:29:57.994899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xxvx" event={"ID":"ed9e8860-db2a-48d1-a571-6475da138f70","Type":"ContainerStarted","Data":"0a2a940ea4112591bb9fed00e45ace41010d5c84cbab03ab155a433a37e22655"}
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.014406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xxvx" event={"ID":"ed9e8860-db2a-48d1-a571-6475da138f70","Type":"ContainerStarted","Data":"82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f"}
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.149849 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"]
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.151184 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.156307 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.157537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.160720 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"]
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.225981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-config-volume\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.226303 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhq47\" (UniqueName: \"kubernetes.io/projected/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-kube-api-access-mhq47\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.226707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-secret-volume\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.329002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-config-volume\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.329210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhq47\" (UniqueName: \"kubernetes.io/projected/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-kube-api-access-mhq47\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.329375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-secret-volume\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.329930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-config-volume\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.336282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-secret-volume\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.357679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhq47\" (UniqueName: \"kubernetes.io/projected/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-kube-api-access-mhq47\") pod \"collect-profiles-29525490-qg5vd\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:00 crc kubenswrapper[4749]: I0219 19:30:00.470184 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:01 crc kubenswrapper[4749]: I0219 19:30:01.026411 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed9e8860-db2a-48d1-a571-6475da138f70" containerID="82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f" exitCode=0
Feb 19 19:30:01 crc kubenswrapper[4749]: I0219 19:30:01.026516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xxvx" event={"ID":"ed9e8860-db2a-48d1-a571-6475da138f70","Type":"ContainerDied","Data":"82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f"}
Feb 19 19:30:01 crc kubenswrapper[4749]: I0219 19:30:01.106864 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"]
Feb 19 19:30:01 crc kubenswrapper[4749]: W0219 19:30:01.110765 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf14e5ea_0731_41b0_93d5_0fb06e2190c6.slice/crio-0bebda14917eb199146434b516dd25c65539dbe97da5a913feb954035d4b7a8b WatchSource:0}: Error finding container 0bebda14917eb199146434b516dd25c65539dbe97da5a913feb954035d4b7a8b: Status 404 returned error can't find the container with id 0bebda14917eb199146434b516dd25c65539dbe97da5a913feb954035d4b7a8b
Feb 19 19:30:02 crc kubenswrapper[4749]: I0219 19:30:02.038205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xxvx" event={"ID":"ed9e8860-db2a-48d1-a571-6475da138f70","Type":"ContainerStarted","Data":"6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385"}
Feb 19 19:30:02 crc kubenswrapper[4749]: I0219 19:30:02.040147 4749 generic.go:334] "Generic (PLEG): container finished" podID="cf14e5ea-0731-41b0-93d5-0fb06e2190c6" containerID="5f00b43e59cded978f2de7bc6d0610106993c1e10db4568182161ddf66b16ab9" exitCode=0
Feb 19 19:30:02 crc kubenswrapper[4749]: I0219 19:30:02.040241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd" event={"ID":"cf14e5ea-0731-41b0-93d5-0fb06e2190c6","Type":"ContainerDied","Data":"5f00b43e59cded978f2de7bc6d0610106993c1e10db4568182161ddf66b16ab9"}
Feb 19 19:30:02 crc kubenswrapper[4749]: I0219 19:30:02.040717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd" event={"ID":"cf14e5ea-0731-41b0-93d5-0fb06e2190c6","Type":"ContainerStarted","Data":"0bebda14917eb199146434b516dd25c65539dbe97da5a913feb954035d4b7a8b"}
Feb 19 19:30:02 crc kubenswrapper[4749]: I0219 19:30:02.061652 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2xxvx" podStartSLOduration=2.515859856 podStartE2EDuration="6.061632513s" podCreationTimestamp="2026-02-19 19:29:56 +0000 UTC" firstStartedPulling="2026-02-19 19:29:57.996403227 +0000 UTC m=+3371.957623181" lastFinishedPulling="2026-02-19 19:30:01.542175874 +0000 UTC m=+3375.503395838" observedRunningTime="2026-02-19 19:30:02.052514953 +0000 UTC m=+3376.013734927" watchObservedRunningTime="2026-02-19 19:30:02.061632513 +0000 UTC m=+3376.022852467"
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.456306 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.593250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-secret-volume\") pod \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") "
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.593366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-config-volume\") pod \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") "
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.593614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhq47\" (UniqueName: \"kubernetes.io/projected/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-kube-api-access-mhq47\") pod \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\" (UID: \"cf14e5ea-0731-41b0-93d5-0fb06e2190c6\") "
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.594565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf14e5ea-0731-41b0-93d5-0fb06e2190c6" (UID: "cf14e5ea-0731-41b0-93d5-0fb06e2190c6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.602283 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-kube-api-access-mhq47" (OuterVolumeSpecName: "kube-api-access-mhq47") pod "cf14e5ea-0731-41b0-93d5-0fb06e2190c6" (UID: "cf14e5ea-0731-41b0-93d5-0fb06e2190c6"). InnerVolumeSpecName "kube-api-access-mhq47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.602314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf14e5ea-0731-41b0-93d5-0fb06e2190c6" (UID: "cf14e5ea-0731-41b0-93d5-0fb06e2190c6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.696834 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhq47\" (UniqueName: \"kubernetes.io/projected/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-kube-api-access-mhq47\") on node \"crc\" DevicePath \"\""
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.696896 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 19:30:03 crc kubenswrapper[4749]: I0219 19:30:03.696915 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf14e5ea-0731-41b0-93d5-0fb06e2190c6-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 19:30:04 crc kubenswrapper[4749]: I0219 19:30:04.058725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd" event={"ID":"cf14e5ea-0731-41b0-93d5-0fb06e2190c6","Type":"ContainerDied","Data":"0bebda14917eb199146434b516dd25c65539dbe97da5a913feb954035d4b7a8b"}
Feb 19 19:30:04 crc kubenswrapper[4749]: I0219 19:30:04.059117 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bebda14917eb199146434b516dd25c65539dbe97da5a913feb954035d4b7a8b"
Feb 19 19:30:04 crc kubenswrapper[4749]: I0219 19:30:04.058931 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"
Feb 19 19:30:04 crc kubenswrapper[4749]: I0219 19:30:04.527607 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb"]
Feb 19 19:30:04 crc kubenswrapper[4749]: I0219 19:30:04.536779 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525445-n2zcb"]
Feb 19 19:30:04 crc kubenswrapper[4749]: I0219 19:30:04.693132 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f395667-c914-4b04-a6b4-52180a9b0356" path="/var/lib/kubelet/pods/6f395667-c914-4b04-a6b4-52180a9b0356/volumes"
Feb 19 19:30:07 crc kubenswrapper[4749]: I0219 19:30:07.203610 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:30:07 crc kubenswrapper[4749]: I0219 19:30:07.204397 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:30:07 crc kubenswrapper[4749]: I0219 19:30:07.264322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:30:08 crc kubenswrapper[4749]: I0219 19:30:08.160518 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:30:08 crc kubenswrapper[4749]: I0219 19:30:08.204563 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xxvx"]
Feb 19 19:30:09 crc kubenswrapper[4749]: I0219 19:30:09.678973 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:30:09 crc kubenswrapper[4749]: E0219 19:30:09.679615 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.128680 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2xxvx" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="registry-server" containerID="cri-o://6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385" gracePeriod=2
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.591634 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.741745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-utilities\") pod \"ed9e8860-db2a-48d1-a571-6475da138f70\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") "
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.741792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-catalog-content\") pod \"ed9e8860-db2a-48d1-a571-6475da138f70\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") "
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.741952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/ed9e8860-db2a-48d1-a571-6475da138f70-kube-api-access-m27m2\") pod \"ed9e8860-db2a-48d1-a571-6475da138f70\" (UID: \"ed9e8860-db2a-48d1-a571-6475da138f70\") "
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.742729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-utilities" (OuterVolumeSpecName: "utilities") pod "ed9e8860-db2a-48d1-a571-6475da138f70" (UID: "ed9e8860-db2a-48d1-a571-6475da138f70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.747970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed9e8860-db2a-48d1-a571-6475da138f70-kube-api-access-m27m2" (OuterVolumeSpecName: "kube-api-access-m27m2") pod "ed9e8860-db2a-48d1-a571-6475da138f70" (UID: "ed9e8860-db2a-48d1-a571-6475da138f70"). InnerVolumeSpecName "kube-api-access-m27m2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.809322 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed9e8860-db2a-48d1-a571-6475da138f70" (UID: "ed9e8860-db2a-48d1-a571-6475da138f70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.844281 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/ed9e8860-db2a-48d1-a571-6475da138f70-kube-api-access-m27m2\") on node \"crc\" DevicePath \"\""
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.844320 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:30:10 crc kubenswrapper[4749]: I0219 19:30:10.844332 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9e8860-db2a-48d1-a571-6475da138f70-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.137651 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed9e8860-db2a-48d1-a571-6475da138f70" containerID="6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385" exitCode=0
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.137702 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xxvx"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.137701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xxvx" event={"ID":"ed9e8860-db2a-48d1-a571-6475da138f70","Type":"ContainerDied","Data":"6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385"}
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.137768 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xxvx" event={"ID":"ed9e8860-db2a-48d1-a571-6475da138f70","Type":"ContainerDied","Data":"0a2a940ea4112591bb9fed00e45ace41010d5c84cbab03ab155a433a37e22655"}
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.137793 4749 scope.go:117] "RemoveContainer" containerID="6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.171254 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xxvx"]
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.173111 4749 scope.go:117] "RemoveContainer" containerID="82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.185771 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2xxvx"]
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.193978 4749 scope.go:117] "RemoveContainer" containerID="11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.255706 4749 scope.go:117] "RemoveContainer" containerID="6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385"
Feb 19 19:30:11 crc kubenswrapper[4749]: E0219 19:30:11.256591 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385\": container with ID starting with 6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385 not found: ID does not exist" containerID="6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.256626 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385"} err="failed to get container status \"6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385\": rpc error: code = NotFound desc = could not find container \"6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385\": container with ID starting with 6b484980847ea79d6a4c5f4ca92a149961d1053e96367fc4ec9d04401c7f8385 not found: ID does not exist"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.256653 4749 scope.go:117] "RemoveContainer" containerID="82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f"
Feb 19 19:30:11 crc kubenswrapper[4749]: E0219 19:30:11.256943 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f\": container with ID starting with 82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f not found: ID does not exist" containerID="82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.256973 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f"} err="failed to get container status \"82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f\": rpc error: code = NotFound desc = could not find container \"82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f\": container with ID starting with 82e75cab057fe703d07b251ac4c477e47af3aa291cfd80f831cc8eebf4774f3f not found: ID does not exist"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.256989 4749 scope.go:117] "RemoveContainer" containerID="11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc"
Feb 19 19:30:11 crc kubenswrapper[4749]: E0219 19:30:11.257257 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc\": container with ID starting with 11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc not found: ID does not exist" containerID="11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc"
Feb 19 19:30:11 crc kubenswrapper[4749]: I0219 19:30:11.257284 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc"} err="failed to get container status \"11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc\": rpc error: code = NotFound desc = could not find container \"11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc\": container with ID starting with 11b454bc73b9eb5a6694df3908803bf1c47195f43512b1963023d2864d4cdfdc not found: ID does not exist"
Feb 19 19:30:12 crc kubenswrapper[4749]: I0219 19:30:12.696065 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" path="/var/lib/kubelet/pods/ed9e8860-db2a-48d1-a571-6475da138f70/volumes"
Feb 19 19:30:20 crc kubenswrapper[4749]: I0219 19:30:20.679236 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:30:20 crc kubenswrapper[4749]: E0219 19:30:20.680058 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:30:24 crc kubenswrapper[4749]: I0219 19:30:24.692794 4749 scope.go:117] "RemoveContainer" containerID="891f89fc4ca6204e3e993bf46d34bb48a9e9cfc30359e7e4e2dd7c016f26e419"
Feb 19 19:30:35 crc kubenswrapper[4749]: I0219 19:30:35.678958 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c"
Feb 19 19:30:36 crc kubenswrapper[4749]: I0219 19:30:36.412580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"4ec4d745220a9131845dd02d7496e4db3d90a311584068adc0592a01b23f118f"}
Feb 19 19:30:51 crc kubenswrapper[4749]: E0219 19:30:51.848444 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.128:56690->38.102.83.128:36573: write tcp 38.102.83.128:56690->38.102.83.128:36573: write: broken pipe
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.667480 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q4dmw"]
Feb 19 19:31:16 crc kubenswrapper[4749]: E0219 19:31:16.668556 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf14e5ea-0731-41b0-93d5-0fb06e2190c6" containerName="collect-profiles"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.668572 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf14e5ea-0731-41b0-93d5-0fb06e2190c6" containerName="collect-profiles"
Feb 19 19:31:16 crc kubenswrapper[4749]: E0219 19:31:16.668586 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="registry-server"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.668595 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="registry-server"
Feb 19 19:31:16 crc kubenswrapper[4749]: E0219 19:31:16.668627 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="extract-content"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.668635 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="extract-content"
Feb 19 19:31:16 crc kubenswrapper[4749]: E0219 19:31:16.668652 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="extract-utilities"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.668661 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="extract-utilities"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.668897 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf14e5ea-0731-41b0-93d5-0fb06e2190c6" containerName="collect-profiles"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.668921 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed9e8860-db2a-48d1-a571-6475da138f70" containerName="registry-server"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.670707 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.712126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-utilities\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.712202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-catalog-content\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.712344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg7m5\" (UniqueName: \"kubernetes.io/projected/b09953f2-99c5-486a-81a0-2b65a2cd2531-kube-api-access-vg7m5\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.731411 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q4dmw"]
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.814809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg7m5\" (UniqueName: \"kubernetes.io/projected/b09953f2-99c5-486a-81a0-2b65a2cd2531-kube-api-access-vg7m5\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.815053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-utilities\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.815130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-catalog-content\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.815513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-utilities\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.816231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-catalog-content\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:16 crc kubenswrapper[4749]: I0219 19:31:16.846365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg7m5\" (UniqueName: \"kubernetes.io/projected/b09953f2-99c5-486a-81a0-2b65a2cd2531-kube-api-access-vg7m5\") pod \"redhat-operators-q4dmw\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " pod="openshift-marketplace/redhat-operators-q4dmw"
Feb 19 19:31:17 crc kubenswrapper[4749]: I0219 19:31:17.005428 4749 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-q4dmw" Feb 19 19:31:17 crc kubenswrapper[4749]: I0219 19:31:17.489378 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q4dmw"] Feb 19 19:31:17 crc kubenswrapper[4749]: I0219 19:31:17.817556 4749 generic.go:334] "Generic (PLEG): container finished" podID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerID="e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47" exitCode=0 Feb 19 19:31:17 crc kubenswrapper[4749]: I0219 19:31:17.817703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4dmw" event={"ID":"b09953f2-99c5-486a-81a0-2b65a2cd2531","Type":"ContainerDied","Data":"e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47"} Feb 19 19:31:17 crc kubenswrapper[4749]: I0219 19:31:17.817972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4dmw" event={"ID":"b09953f2-99c5-486a-81a0-2b65a2cd2531","Type":"ContainerStarted","Data":"33e1da78d75e1742d3f9beaacbc6509e4e98a0d54de700b08e8ff77e9760d2fd"} Feb 19 19:31:18 crc kubenswrapper[4749]: I0219 19:31:18.827600 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4dmw" event={"ID":"b09953f2-99c5-486a-81a0-2b65a2cd2531","Type":"ContainerStarted","Data":"92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2"} Feb 19 19:31:23 crc kubenswrapper[4749]: I0219 19:31:23.873088 4749 generic.go:334] "Generic (PLEG): container finished" podID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerID="92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2" exitCode=0 Feb 19 19:31:23 crc kubenswrapper[4749]: I0219 19:31:23.873160 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4dmw" 
event={"ID":"b09953f2-99c5-486a-81a0-2b65a2cd2531","Type":"ContainerDied","Data":"92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2"} Feb 19 19:31:24 crc kubenswrapper[4749]: I0219 19:31:24.885676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4dmw" event={"ID":"b09953f2-99c5-486a-81a0-2b65a2cd2531","Type":"ContainerStarted","Data":"6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e"} Feb 19 19:31:24 crc kubenswrapper[4749]: I0219 19:31:24.918903 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q4dmw" podStartSLOduration=2.446510938 podStartE2EDuration="8.918869169s" podCreationTimestamp="2026-02-19 19:31:16 +0000 UTC" firstStartedPulling="2026-02-19 19:31:17.819278007 +0000 UTC m=+3451.780497961" lastFinishedPulling="2026-02-19 19:31:24.291636238 +0000 UTC m=+3458.252856192" observedRunningTime="2026-02-19 19:31:24.907623518 +0000 UTC m=+3458.868843492" watchObservedRunningTime="2026-02-19 19:31:24.918869169 +0000 UTC m=+3458.880089123" Feb 19 19:31:27 crc kubenswrapper[4749]: I0219 19:31:27.005943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q4dmw" Feb 19 19:31:27 crc kubenswrapper[4749]: I0219 19:31:27.006265 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q4dmw" Feb 19 19:31:28 crc kubenswrapper[4749]: I0219 19:31:28.048959 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q4dmw" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="registry-server" probeResult="failure" output=< Feb 19 19:31:28 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 19:31:28 crc kubenswrapper[4749]: > Feb 19 19:31:37 crc kubenswrapper[4749]: I0219 19:31:37.050253 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q4dmw" Feb 19 19:31:37 crc kubenswrapper[4749]: I0219 19:31:37.101711 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q4dmw" Feb 19 19:31:37 crc kubenswrapper[4749]: I0219 19:31:37.289612 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q4dmw"] Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.000629 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q4dmw" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="registry-server" containerID="cri-o://6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e" gracePeriod=2 Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.442866 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4dmw" Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.598828 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-utilities\") pod \"b09953f2-99c5-486a-81a0-2b65a2cd2531\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.598984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg7m5\" (UniqueName: \"kubernetes.io/projected/b09953f2-99c5-486a-81a0-2b65a2cd2531-kube-api-access-vg7m5\") pod \"b09953f2-99c5-486a-81a0-2b65a2cd2531\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.599214 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-catalog-content\") pod 
\"b09953f2-99c5-486a-81a0-2b65a2cd2531\" (UID: \"b09953f2-99c5-486a-81a0-2b65a2cd2531\") " Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.599700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-utilities" (OuterVolumeSpecName: "utilities") pod "b09953f2-99c5-486a-81a0-2b65a2cd2531" (UID: "b09953f2-99c5-486a-81a0-2b65a2cd2531"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.600400 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.606565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09953f2-99c5-486a-81a0-2b65a2cd2531-kube-api-access-vg7m5" (OuterVolumeSpecName: "kube-api-access-vg7m5") pod "b09953f2-99c5-486a-81a0-2b65a2cd2531" (UID: "b09953f2-99c5-486a-81a0-2b65a2cd2531"). InnerVolumeSpecName "kube-api-access-vg7m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.702895 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg7m5\" (UniqueName: \"kubernetes.io/projected/b09953f2-99c5-486a-81a0-2b65a2cd2531-kube-api-access-vg7m5\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.745888 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b09953f2-99c5-486a-81a0-2b65a2cd2531" (UID: "b09953f2-99c5-486a-81a0-2b65a2cd2531"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:31:39 crc kubenswrapper[4749]: I0219 19:31:39.805847 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b09953f2-99c5-486a-81a0-2b65a2cd2531-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.013519 4749 generic.go:334] "Generic (PLEG): container finished" podID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerID="6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e" exitCode=0 Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.013566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4dmw" event={"ID":"b09953f2-99c5-486a-81a0-2b65a2cd2531","Type":"ContainerDied","Data":"6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e"} Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.013605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4dmw" event={"ID":"b09953f2-99c5-486a-81a0-2b65a2cd2531","Type":"ContainerDied","Data":"33e1da78d75e1742d3f9beaacbc6509e4e98a0d54de700b08e8ff77e9760d2fd"} Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.013615 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q4dmw" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.013629 4749 scope.go:117] "RemoveContainer" containerID="6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.033534 4749 scope.go:117] "RemoveContainer" containerID="92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.061953 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q4dmw"] Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.073211 4749 scope.go:117] "RemoveContainer" containerID="e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.075358 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q4dmw"] Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.101885 4749 scope.go:117] "RemoveContainer" containerID="6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e" Feb 19 19:31:40 crc kubenswrapper[4749]: E0219 19:31:40.102204 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e\": container with ID starting with 6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e not found: ID does not exist" containerID="6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.102238 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e"} err="failed to get container status \"6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e\": rpc error: code = NotFound desc = could not find container 
\"6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e\": container with ID starting with 6e2500497342123eb750bdbe309e3306c3571038aaa89c50ef08cffbcd300a4e not found: ID does not exist" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.102260 4749 scope.go:117] "RemoveContainer" containerID="92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2" Feb 19 19:31:40 crc kubenswrapper[4749]: E0219 19:31:40.102458 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2\": container with ID starting with 92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2 not found: ID does not exist" containerID="92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.102482 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2"} err="failed to get container status \"92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2\": rpc error: code = NotFound desc = could not find container \"92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2\": container with ID starting with 92f3a0f17dee6bd172c05be0d41eda8fceed1bce6a29f24a45f447c33d1ce3a2 not found: ID does not exist" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.102501 4749 scope.go:117] "RemoveContainer" containerID="e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47" Feb 19 19:31:40 crc kubenswrapper[4749]: E0219 19:31:40.102784 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47\": container with ID starting with e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47 not found: ID does not exist" 
containerID="e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.102804 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47"} err="failed to get container status \"e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47\": rpc error: code = NotFound desc = could not find container \"e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47\": container with ID starting with e865afb3c5996dbe5d63e884207e257135bbdd9865f3d670ce9b2ee22021bf47 not found: ID does not exist" Feb 19 19:31:40 crc kubenswrapper[4749]: I0219 19:31:40.689453 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" path="/var/lib/kubelet/pods/b09953f2-99c5-486a-81a0-2b65a2cd2531/volumes" Feb 19 19:32:54 crc kubenswrapper[4749]: I0219 19:32:54.725660 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:32:54 crc kubenswrapper[4749]: I0219 19:32:54.726136 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:33:24 crc kubenswrapper[4749]: I0219 19:33:24.725657 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 19 19:33:24 crc kubenswrapper[4749]: I0219 19:33:24.726454 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.133901 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rfw5q"] Feb 19 19:33:49 crc kubenswrapper[4749]: E0219 19:33:49.135018 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="extract-content" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.139210 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="extract-content" Feb 19 19:33:49 crc kubenswrapper[4749]: E0219 19:33:49.139251 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="registry-server" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.139257 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="registry-server" Feb 19 19:33:49 crc kubenswrapper[4749]: E0219 19:33:49.139275 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="extract-utilities" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.139281 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="extract-utilities" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.139592 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09953f2-99c5-486a-81a0-2b65a2cd2531" containerName="registry-server" Feb 19 
19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.141154 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.155952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfw5q"] Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.269901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/34a60908-5d55-49ff-bff4-3d5da8be5ebf-kube-api-access-xbvb9\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.269948 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-catalog-content\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.270068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-utilities\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.372323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-utilities\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 
19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.372526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/34a60908-5d55-49ff-bff4-3d5da8be5ebf-kube-api-access-xbvb9\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.372570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-catalog-content\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.372900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-utilities\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.373195 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-catalog-content\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc kubenswrapper[4749]: I0219 19:33:49.393523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/34a60908-5d55-49ff-bff4-3d5da8be5ebf-kube-api-access-xbvb9\") pod \"certified-operators-rfw5q\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:49 crc 
kubenswrapper[4749]: I0219 19:33:49.518858 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:50 crc kubenswrapper[4749]: I0219 19:33:50.095882 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfw5q"] Feb 19 19:33:50 crc kubenswrapper[4749]: I0219 19:33:50.237538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfw5q" event={"ID":"34a60908-5d55-49ff-bff4-3d5da8be5ebf","Type":"ContainerStarted","Data":"ba8f54219f12ad93071b46a85581b650811c3b48f417e50bfe8cf6f2bef45b4f"} Feb 19 19:33:51 crc kubenswrapper[4749]: I0219 19:33:51.246704 4749 generic.go:334] "Generic (PLEG): container finished" podID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerID="2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6" exitCode=0 Feb 19 19:33:51 crc kubenswrapper[4749]: I0219 19:33:51.246743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfw5q" event={"ID":"34a60908-5d55-49ff-bff4-3d5da8be5ebf","Type":"ContainerDied","Data":"2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6"} Feb 19 19:33:51 crc kubenswrapper[4749]: I0219 19:33:51.249208 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:33:53 crc kubenswrapper[4749]: I0219 19:33:53.274569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfw5q" event={"ID":"34a60908-5d55-49ff-bff4-3d5da8be5ebf","Type":"ContainerStarted","Data":"febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49"} Feb 19 19:33:54 crc kubenswrapper[4749]: I0219 19:33:54.287687 4749 generic.go:334] "Generic (PLEG): container finished" podID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerID="febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49" exitCode=0 Feb 19 
19:33:54 crc kubenswrapper[4749]: I0219 19:33:54.289138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfw5q" event={"ID":"34a60908-5d55-49ff-bff4-3d5da8be5ebf","Type":"ContainerDied","Data":"febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49"} Feb 19 19:33:54 crc kubenswrapper[4749]: I0219 19:33:54.725578 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:33:54 crc kubenswrapper[4749]: I0219 19:33:54.725653 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:33:54 crc kubenswrapper[4749]: I0219 19:33:54.725710 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 19:33:54 crc kubenswrapper[4749]: I0219 19:33:54.726550 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ec4d745220a9131845dd02d7496e4db3d90a311584068adc0592a01b23f118f"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:33:54 crc kubenswrapper[4749]: I0219 19:33:54.726625 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" 
containerID="cri-o://4ec4d745220a9131845dd02d7496e4db3d90a311584068adc0592a01b23f118f" gracePeriod=600 Feb 19 19:33:55 crc kubenswrapper[4749]: I0219 19:33:55.312554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfw5q" event={"ID":"34a60908-5d55-49ff-bff4-3d5da8be5ebf","Type":"ContainerStarted","Data":"0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526"} Feb 19 19:33:55 crc kubenswrapper[4749]: I0219 19:33:55.316570 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="4ec4d745220a9131845dd02d7496e4db3d90a311584068adc0592a01b23f118f" exitCode=0 Feb 19 19:33:55 crc kubenswrapper[4749]: I0219 19:33:55.316624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"4ec4d745220a9131845dd02d7496e4db3d90a311584068adc0592a01b23f118f"} Feb 19 19:33:55 crc kubenswrapper[4749]: I0219 19:33:55.316688 4749 scope.go:117] "RemoveContainer" containerID="89da922b515419a45ac144e1a2da71fed69a9d099edbcdfb634f40bd84dcc63c" Feb 19 19:33:55 crc kubenswrapper[4749]: I0219 19:33:55.349353 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rfw5q" podStartSLOduration=2.929800204 podStartE2EDuration="6.349326317s" podCreationTimestamp="2026-02-19 19:33:49 +0000 UTC" firstStartedPulling="2026-02-19 19:33:51.248977706 +0000 UTC m=+3605.210197660" lastFinishedPulling="2026-02-19 19:33:54.668503819 +0000 UTC m=+3608.629723773" observedRunningTime="2026-02-19 19:33:55.335850751 +0000 UTC m=+3609.297070735" watchObservedRunningTime="2026-02-19 19:33:55.349326317 +0000 UTC m=+3609.310546281" Feb 19 19:33:56 crc kubenswrapper[4749]: I0219 19:33:56.332113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd"} Feb 19 19:33:59 crc kubenswrapper[4749]: I0219 19:33:59.520232 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:59 crc kubenswrapper[4749]: I0219 19:33:59.521300 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:33:59 crc kubenswrapper[4749]: I0219 19:33:59.569900 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:34:00 crc kubenswrapper[4749]: I0219 19:34:00.429125 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:34:00 crc kubenswrapper[4749]: I0219 19:34:00.475327 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfw5q"] Feb 19 19:34:02 crc kubenswrapper[4749]: I0219 19:34:02.406787 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rfw5q" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="registry-server" containerID="cri-o://0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526" gracePeriod=2 Feb 19 19:34:02 crc kubenswrapper[4749]: I0219 19:34:02.947073 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.113096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/34a60908-5d55-49ff-bff4-3d5da8be5ebf-kube-api-access-xbvb9\") pod \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.113249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-utilities\") pod \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.113300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-catalog-content\") pod \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\" (UID: \"34a60908-5d55-49ff-bff4-3d5da8be5ebf\") " Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.114332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-utilities" (OuterVolumeSpecName: "utilities") pod "34a60908-5d55-49ff-bff4-3d5da8be5ebf" (UID: "34a60908-5d55-49ff-bff4-3d5da8be5ebf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.121395 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a60908-5d55-49ff-bff4-3d5da8be5ebf-kube-api-access-xbvb9" (OuterVolumeSpecName: "kube-api-access-xbvb9") pod "34a60908-5d55-49ff-bff4-3d5da8be5ebf" (UID: "34a60908-5d55-49ff-bff4-3d5da8be5ebf"). InnerVolumeSpecName "kube-api-access-xbvb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.216103 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.216141 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/34a60908-5d55-49ff-bff4-3d5da8be5ebf-kube-api-access-xbvb9\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.421121 4749 generic.go:334] "Generic (PLEG): container finished" podID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerID="0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526" exitCode=0 Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.421169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfw5q" event={"ID":"34a60908-5d55-49ff-bff4-3d5da8be5ebf","Type":"ContainerDied","Data":"0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526"} Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.421197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfw5q" event={"ID":"34a60908-5d55-49ff-bff4-3d5da8be5ebf","Type":"ContainerDied","Data":"ba8f54219f12ad93071b46a85581b650811c3b48f417e50bfe8cf6f2bef45b4f"} Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.421216 4749 scope.go:117] "RemoveContainer" containerID="0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.421241 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rfw5q" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.442177 4749 scope.go:117] "RemoveContainer" containerID="febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.462797 4749 scope.go:117] "RemoveContainer" containerID="2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.519760 4749 scope.go:117] "RemoveContainer" containerID="0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526" Feb 19 19:34:03 crc kubenswrapper[4749]: E0219 19:34:03.520400 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526\": container with ID starting with 0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526 not found: ID does not exist" containerID="0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.520650 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526"} err="failed to get container status \"0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526\": rpc error: code = NotFound desc = could not find container \"0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526\": container with ID starting with 0458252c3cf9b9bd809e252a43681928bf0f11cd38fb3eb0fea92d2f5c233526 not found: ID does not exist" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.520761 4749 scope.go:117] "RemoveContainer" containerID="febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49" Feb 19 19:34:03 crc kubenswrapper[4749]: E0219 19:34:03.521283 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49\": container with ID starting with febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49 not found: ID does not exist" containerID="febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.521327 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49"} err="failed to get container status \"febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49\": rpc error: code = NotFound desc = could not find container \"febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49\": container with ID starting with febd7c4fa18874a92b6b512baa459a650bdd24d4a629adf55e2dc80e6c5d6e49 not found: ID does not exist" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.521358 4749 scope.go:117] "RemoveContainer" containerID="2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6" Feb 19 19:34:03 crc kubenswrapper[4749]: E0219 19:34:03.523651 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6\": container with ID starting with 2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6 not found: ID does not exist" containerID="2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.523772 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6"} err="failed to get container status \"2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6\": rpc error: code = NotFound desc = could not find container 
\"2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6\": container with ID starting with 2bcabf7055cfd45e9a43673cc611d4c4915ebb20bba0dbb8a514f58a268b17e6 not found: ID does not exist" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.664280 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34a60908-5d55-49ff-bff4-3d5da8be5ebf" (UID: "34a60908-5d55-49ff-bff4-3d5da8be5ebf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.727137 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34a60908-5d55-49ff-bff4-3d5da8be5ebf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.766664 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfw5q"] Feb 19 19:34:03 crc kubenswrapper[4749]: I0219 19:34:03.777132 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rfw5q"] Feb 19 19:34:04 crc kubenswrapper[4749]: I0219 19:34:04.694442 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" path="/var/lib/kubelet/pods/34a60908-5d55-49ff-bff4-3d5da8be5ebf/volumes" Feb 19 19:36:24 crc kubenswrapper[4749]: I0219 19:36:24.724959 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:36:24 crc kubenswrapper[4749]: I0219 19:36:24.725524 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:36:54 crc kubenswrapper[4749]: I0219 19:36:54.726267 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:36:54 crc kubenswrapper[4749]: I0219 19:36:54.726809 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:37:24 crc kubenswrapper[4749]: I0219 19:37:24.725093 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:37:24 crc kubenswrapper[4749]: I0219 19:37:24.725561 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:37:24 crc kubenswrapper[4749]: I0219 19:37:24.725612 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 19:37:24 crc 
kubenswrapper[4749]: I0219 19:37:24.726468 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:37:24 crc kubenswrapper[4749]: I0219 19:37:24.726530 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" gracePeriod=600 Feb 19 19:37:24 crc kubenswrapper[4749]: E0219 19:37:24.845803 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:37:25 crc kubenswrapper[4749]: I0219 19:37:25.030505 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" exitCode=0 Feb 19 19:37:25 crc kubenswrapper[4749]: I0219 19:37:25.030540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd"} Feb 19 19:37:25 crc kubenswrapper[4749]: I0219 19:37:25.030590 4749 scope.go:117] "RemoveContainer" 
containerID="4ec4d745220a9131845dd02d7496e4db3d90a311584068adc0592a01b23f118f" Feb 19 19:37:25 crc kubenswrapper[4749]: I0219 19:37:25.031237 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:37:25 crc kubenswrapper[4749]: E0219 19:37:25.031517 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:37:38 crc kubenswrapper[4749]: I0219 19:37:38.678834 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:37:38 crc kubenswrapper[4749]: E0219 19:37:38.679645 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:37:53 crc kubenswrapper[4749]: I0219 19:37:53.679324 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:37:53 crc kubenswrapper[4749]: E0219 19:37:53.680175 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:38:06 crc kubenswrapper[4749]: I0219 19:38:06.689185 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:38:06 crc kubenswrapper[4749]: E0219 19:38:06.690185 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:38:21 crc kubenswrapper[4749]: I0219 19:38:21.679217 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:38:21 crc kubenswrapper[4749]: E0219 19:38:21.680297 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:38:35 crc kubenswrapper[4749]: I0219 19:38:35.680320 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:38:35 crc kubenswrapper[4749]: E0219 19:38:35.681151 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:38:50 crc kubenswrapper[4749]: I0219 19:38:50.679316 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:38:50 crc kubenswrapper[4749]: E0219 19:38:50.679920 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.585129 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c9rrb"] Feb 19 19:39:00 crc kubenswrapper[4749]: E0219 19:39:00.586260 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="registry-server" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.586279 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="registry-server" Feb 19 19:39:00 crc kubenswrapper[4749]: E0219 19:39:00.586299 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="extract-utilities" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.586308 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="extract-utilities" Feb 19 19:39:00 crc kubenswrapper[4749]: E0219 19:39:00.586338 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="extract-content" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.586347 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="extract-content" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.586612 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a60908-5d55-49ff-bff4-3d5da8be5ebf" containerName="registry-server" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.588915 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.602135 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9rrb"] Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.713893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgpf\" (UniqueName: \"kubernetes.io/projected/5d047bc6-e4d1-4cc0-9692-16e38608adff-kube-api-access-6xgpf\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.714010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-catalog-content\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.714052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-utilities\") pod \"redhat-marketplace-c9rrb\" (UID: 
\"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.815856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-catalog-content\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.815932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-utilities\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.816178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgpf\" (UniqueName: \"kubernetes.io/projected/5d047bc6-e4d1-4cc0-9692-16e38608adff-kube-api-access-6xgpf\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.816509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-catalog-content\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.816550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-utilities\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " 
pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.837485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgpf\" (UniqueName: \"kubernetes.io/projected/5d047bc6-e4d1-4cc0-9692-16e38608adff-kube-api-access-6xgpf\") pod \"redhat-marketplace-c9rrb\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:00 crc kubenswrapper[4749]: I0219 19:39:00.908677 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:01 crc kubenswrapper[4749]: I0219 19:39:01.752744 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9rrb"] Feb 19 19:39:02 crc kubenswrapper[4749]: I0219 19:39:02.066875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9rrb" event={"ID":"5d047bc6-e4d1-4cc0-9692-16e38608adff","Type":"ContainerStarted","Data":"82c84d91f5453c225c19eb7e257297e627c1d9bdee4685fb3060ecb1e4594dab"} Feb 19 19:39:04 crc kubenswrapper[4749]: I0219 19:39:04.087631 4749 generic.go:334] "Generic (PLEG): container finished" podID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerID="cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6" exitCode=0 Feb 19 19:39:04 crc kubenswrapper[4749]: I0219 19:39:04.087742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9rrb" event={"ID":"5d047bc6-e4d1-4cc0-9692-16e38608adff","Type":"ContainerDied","Data":"cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6"} Feb 19 19:39:04 crc kubenswrapper[4749]: I0219 19:39:04.090938 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:39:04 crc kubenswrapper[4749]: I0219 19:39:04.679541 4749 scope.go:117] "RemoveContainer" 
containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:39:04 crc kubenswrapper[4749]: E0219 19:39:04.680091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:39:06 crc kubenswrapper[4749]: I0219 19:39:06.111395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9rrb" event={"ID":"5d047bc6-e4d1-4cc0-9692-16e38608adff","Type":"ContainerStarted","Data":"0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b"} Feb 19 19:39:07 crc kubenswrapper[4749]: I0219 19:39:07.122201 4749 generic.go:334] "Generic (PLEG): container finished" podID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerID="0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b" exitCode=0 Feb 19 19:39:07 crc kubenswrapper[4749]: I0219 19:39:07.122859 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9rrb" event={"ID":"5d047bc6-e4d1-4cc0-9692-16e38608adff","Type":"ContainerDied","Data":"0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b"} Feb 19 19:39:08 crc kubenswrapper[4749]: I0219 19:39:08.143108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9rrb" event={"ID":"5d047bc6-e4d1-4cc0-9692-16e38608adff","Type":"ContainerStarted","Data":"0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e"} Feb 19 19:39:08 crc kubenswrapper[4749]: I0219 19:39:08.181951 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c9rrb" 
podStartSLOduration=4.7731199140000005 podStartE2EDuration="8.18189443s" podCreationTimestamp="2026-02-19 19:39:00 +0000 UTC" firstStartedPulling="2026-02-19 19:39:04.090468407 +0000 UTC m=+3918.051688391" lastFinishedPulling="2026-02-19 19:39:07.499242953 +0000 UTC m=+3921.460462907" observedRunningTime="2026-02-19 19:39:08.165817432 +0000 UTC m=+3922.127037416" watchObservedRunningTime="2026-02-19 19:39:08.18189443 +0000 UTC m=+3922.143114394" Feb 19 19:39:10 crc kubenswrapper[4749]: I0219 19:39:10.909082 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:10 crc kubenswrapper[4749]: I0219 19:39:10.909461 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:10 crc kubenswrapper[4749]: I0219 19:39:10.974962 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:19 crc kubenswrapper[4749]: I0219 19:39:19.679454 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:39:19 crc kubenswrapper[4749]: E0219 19:39:19.680270 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:39:20 crc kubenswrapper[4749]: I0219 19:39:20.975115 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.031321 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-c9rrb"] Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.260468 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c9rrb" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="registry-server" containerID="cri-o://0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e" gracePeriod=2 Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.784581 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.874194 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xgpf\" (UniqueName: \"kubernetes.io/projected/5d047bc6-e4d1-4cc0-9692-16e38608adff-kube-api-access-6xgpf\") pod \"5d047bc6-e4d1-4cc0-9692-16e38608adff\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.874357 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-utilities\") pod \"5d047bc6-e4d1-4cc0-9692-16e38608adff\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.874787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-catalog-content\") pod \"5d047bc6-e4d1-4cc0-9692-16e38608adff\" (UID: \"5d047bc6-e4d1-4cc0-9692-16e38608adff\") " Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.875992 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-utilities" (OuterVolumeSpecName: "utilities") pod "5d047bc6-e4d1-4cc0-9692-16e38608adff" (UID: 
"5d047bc6-e4d1-4cc0-9692-16e38608adff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.886273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d047bc6-e4d1-4cc0-9692-16e38608adff-kube-api-access-6xgpf" (OuterVolumeSpecName: "kube-api-access-6xgpf") pod "5d047bc6-e4d1-4cc0-9692-16e38608adff" (UID: "5d047bc6-e4d1-4cc0-9692-16e38608adff"). InnerVolumeSpecName "kube-api-access-6xgpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.900368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d047bc6-e4d1-4cc0-9692-16e38608adff" (UID: "5d047bc6-e4d1-4cc0-9692-16e38608adff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.977446 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.977489 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xgpf\" (UniqueName: \"kubernetes.io/projected/5d047bc6-e4d1-4cc0-9692-16e38608adff-kube-api-access-6xgpf\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:21 crc kubenswrapper[4749]: I0219 19:39:21.977505 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d047bc6-e4d1-4cc0-9692-16e38608adff-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.271469 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerID="0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e" exitCode=0 Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.271512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9rrb" event={"ID":"5d047bc6-e4d1-4cc0-9692-16e38608adff","Type":"ContainerDied","Data":"0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e"} Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.271526 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9rrb" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.271553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9rrb" event={"ID":"5d047bc6-e4d1-4cc0-9692-16e38608adff","Type":"ContainerDied","Data":"82c84d91f5453c225c19eb7e257297e627c1d9bdee4685fb3060ecb1e4594dab"} Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.271574 4749 scope.go:117] "RemoveContainer" containerID="0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.316232 4749 scope.go:117] "RemoveContainer" containerID="0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.325209 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9rrb"] Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.336062 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9rrb"] Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.341837 4749 scope.go:117] "RemoveContainer" containerID="cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.393443 4749 scope.go:117] "RemoveContainer" 
containerID="0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e" Feb 19 19:39:22 crc kubenswrapper[4749]: E0219 19:39:22.393996 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e\": container with ID starting with 0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e not found: ID does not exist" containerID="0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.394068 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e"} err="failed to get container status \"0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e\": rpc error: code = NotFound desc = could not find container \"0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e\": container with ID starting with 0d4df3a4ce207d5cdbbed7be3d3916e83a9891aabfe7f2380c9cdddaed7bbb3e not found: ID does not exist" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.394096 4749 scope.go:117] "RemoveContainer" containerID="0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b" Feb 19 19:39:22 crc kubenswrapper[4749]: E0219 19:39:22.394826 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b\": container with ID starting with 0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b not found: ID does not exist" containerID="0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.394878 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b"} err="failed to get container status \"0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b\": rpc error: code = NotFound desc = could not find container \"0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b\": container with ID starting with 0c1965b384b44a35da1cffe31d1f2a69140ea0975a3b5cb4c59eb62bfbc7177b not found: ID does not exist" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.394913 4749 scope.go:117] "RemoveContainer" containerID="cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6" Feb 19 19:39:22 crc kubenswrapper[4749]: E0219 19:39:22.395355 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6\": container with ID starting with cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6 not found: ID does not exist" containerID="cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.395389 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6"} err="failed to get container status \"cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6\": rpc error: code = NotFound desc = could not find container \"cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6\": container with ID starting with cd0ef7ac2e69a8c88502ac47e8bc1f2871bd80178860e6ab03946fcd6745fdb6 not found: ID does not exist" Feb 19 19:39:22 crc kubenswrapper[4749]: I0219 19:39:22.690275 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" path="/var/lib/kubelet/pods/5d047bc6-e4d1-4cc0-9692-16e38608adff/volumes" Feb 19 19:39:34 crc kubenswrapper[4749]: I0219 
19:39:34.679732 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:39:34 crc kubenswrapper[4749]: E0219 19:39:34.680828 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:39:48 crc kubenswrapper[4749]: I0219 19:39:48.679615 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:39:48 crc kubenswrapper[4749]: E0219 19:39:48.681928 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:40:02 crc kubenswrapper[4749]: I0219 19:40:02.678722 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:40:02 crc kubenswrapper[4749]: E0219 19:40:02.679599 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:40:16 crc 
kubenswrapper[4749]: I0219 19:40:16.692364 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:40:16 crc kubenswrapper[4749]: E0219 19:40:16.693724 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:40:31 crc kubenswrapper[4749]: I0219 19:40:31.679682 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:40:31 crc kubenswrapper[4749]: E0219 19:40:31.680441 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:40:46 crc kubenswrapper[4749]: I0219 19:40:46.685958 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:40:46 crc kubenswrapper[4749]: E0219 19:40:46.686978 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 
19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.539190 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hzzfc"] Feb 19 19:40:51 crc kubenswrapper[4749]: E0219 19:40:51.540314 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="extract-content" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.540327 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="extract-content" Feb 19 19:40:51 crc kubenswrapper[4749]: E0219 19:40:51.540353 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="extract-utilities" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.540360 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="extract-utilities" Feb 19 19:40:51 crc kubenswrapper[4749]: E0219 19:40:51.540372 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="registry-server" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.540378 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="registry-server" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.540624 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d047bc6-e4d1-4cc0-9692-16e38608adff" containerName="registry-server" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.542308 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.555562 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzzfc"] Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.558791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27x46\" (UniqueName: \"kubernetes.io/projected/4773cd8e-04ef-45bf-ad2f-2c387310fd74-kube-api-access-27x46\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.558937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-utilities\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.558973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-catalog-content\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.660839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-utilities\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.660892 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-catalog-content\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.661062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27x46\" (UniqueName: \"kubernetes.io/projected/4773cd8e-04ef-45bf-ad2f-2c387310fd74-kube-api-access-27x46\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.661389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-utilities\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.661459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-catalog-content\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.691772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27x46\" (UniqueName: \"kubernetes.io/projected/4773cd8e-04ef-45bf-ad2f-2c387310fd74-kube-api-access-27x46\") pod \"community-operators-hzzfc\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:51 crc kubenswrapper[4749]: I0219 19:40:51.876900 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:40:52 crc kubenswrapper[4749]: I0219 19:40:52.426721 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzzfc"] Feb 19 19:40:53 crc kubenswrapper[4749]: I0219 19:40:53.155778 4749 generic.go:334] "Generic (PLEG): container finished" podID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerID="6e077dbe27b17d3327ca77eb88fdfe9fab25b6b82631a4da1afe7ce6819d40a8" exitCode=0 Feb 19 19:40:53 crc kubenswrapper[4749]: I0219 19:40:53.155896 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzzfc" event={"ID":"4773cd8e-04ef-45bf-ad2f-2c387310fd74","Type":"ContainerDied","Data":"6e077dbe27b17d3327ca77eb88fdfe9fab25b6b82631a4da1afe7ce6819d40a8"} Feb 19 19:40:53 crc kubenswrapper[4749]: I0219 19:40:53.157075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzzfc" event={"ID":"4773cd8e-04ef-45bf-ad2f-2c387310fd74","Type":"ContainerStarted","Data":"974bf23d4c79f425b109319514b8b965bbcec36734e31e36a4ffe80ed09588c8"} Feb 19 19:40:55 crc kubenswrapper[4749]: I0219 19:40:55.181331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzzfc" event={"ID":"4773cd8e-04ef-45bf-ad2f-2c387310fd74","Type":"ContainerStarted","Data":"1abe47d3879a6a6de50bb036407360d55b0538a11c28998e467d354b26145652"} Feb 19 19:40:58 crc kubenswrapper[4749]: I0219 19:40:58.208874 4749 generic.go:334] "Generic (PLEG): container finished" podID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerID="1abe47d3879a6a6de50bb036407360d55b0538a11c28998e467d354b26145652" exitCode=0 Feb 19 19:40:58 crc kubenswrapper[4749]: I0219 19:40:58.208968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzzfc" 
event={"ID":"4773cd8e-04ef-45bf-ad2f-2c387310fd74","Type":"ContainerDied","Data":"1abe47d3879a6a6de50bb036407360d55b0538a11c28998e467d354b26145652"} Feb 19 19:40:59 crc kubenswrapper[4749]: I0219 19:40:59.220954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzzfc" event={"ID":"4773cd8e-04ef-45bf-ad2f-2c387310fd74","Type":"ContainerStarted","Data":"d84db3de45d02882c2e4ce91496e74df3cd6a3a551f24d625ec54e80f82f8b02"} Feb 19 19:40:59 crc kubenswrapper[4749]: I0219 19:40:59.240878 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hzzfc" podStartSLOduration=2.650987123 podStartE2EDuration="8.240861082s" podCreationTimestamp="2026-02-19 19:40:51 +0000 UTC" firstStartedPulling="2026-02-19 19:40:53.157416643 +0000 UTC m=+4027.118636597" lastFinishedPulling="2026-02-19 19:40:58.747290602 +0000 UTC m=+4032.708510556" observedRunningTime="2026-02-19 19:40:59.236546328 +0000 UTC m=+4033.197766282" watchObservedRunningTime="2026-02-19 19:40:59.240861082 +0000 UTC m=+4033.202081036" Feb 19 19:41:00 crc kubenswrapper[4749]: I0219 19:41:00.679466 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:41:00 crc kubenswrapper[4749]: E0219 19:41:00.680052 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:41:01 crc kubenswrapper[4749]: I0219 19:41:01.877355 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:41:01 crc 
kubenswrapper[4749]: I0219 19:41:01.877626 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:41:01 crc kubenswrapper[4749]: I0219 19:41:01.924398 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:41:11 crc kubenswrapper[4749]: I0219 19:41:11.923643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:41:11 crc kubenswrapper[4749]: I0219 19:41:11.985858 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hzzfc"] Feb 19 19:41:12 crc kubenswrapper[4749]: I0219 19:41:12.363587 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hzzfc" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="registry-server" containerID="cri-o://d84db3de45d02882c2e4ce91496e74df3cd6a3a551f24d625ec54e80f82f8b02" gracePeriod=2 Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.379501 4749 generic.go:334] "Generic (PLEG): container finished" podID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerID="d84db3de45d02882c2e4ce91496e74df3cd6a3a551f24d625ec54e80f82f8b02" exitCode=0 Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.379590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzzfc" event={"ID":"4773cd8e-04ef-45bf-ad2f-2c387310fd74","Type":"ContainerDied","Data":"d84db3de45d02882c2e4ce91496e74df3cd6a3a551f24d625ec54e80f82f8b02"} Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.539930 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.691202 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-catalog-content\") pod \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.691256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-utilities\") pod \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.691330 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27x46\" (UniqueName: \"kubernetes.io/projected/4773cd8e-04ef-45bf-ad2f-2c387310fd74-kube-api-access-27x46\") pod \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\" (UID: \"4773cd8e-04ef-45bf-ad2f-2c387310fd74\") " Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.693086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-utilities" (OuterVolumeSpecName: "utilities") pod "4773cd8e-04ef-45bf-ad2f-2c387310fd74" (UID: "4773cd8e-04ef-45bf-ad2f-2c387310fd74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.697220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4773cd8e-04ef-45bf-ad2f-2c387310fd74-kube-api-access-27x46" (OuterVolumeSpecName: "kube-api-access-27x46") pod "4773cd8e-04ef-45bf-ad2f-2c387310fd74" (UID: "4773cd8e-04ef-45bf-ad2f-2c387310fd74"). InnerVolumeSpecName "kube-api-access-27x46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.749803 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4773cd8e-04ef-45bf-ad2f-2c387310fd74" (UID: "4773cd8e-04ef-45bf-ad2f-2c387310fd74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.793624 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.793668 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4773cd8e-04ef-45bf-ad2f-2c387310fd74-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:13 crc kubenswrapper[4749]: I0219 19:41:13.793678 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27x46\" (UniqueName: \"kubernetes.io/projected/4773cd8e-04ef-45bf-ad2f-2c387310fd74-kube-api-access-27x46\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.392846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzzfc" event={"ID":"4773cd8e-04ef-45bf-ad2f-2c387310fd74","Type":"ContainerDied","Data":"974bf23d4c79f425b109319514b8b965bbcec36734e31e36a4ffe80ed09588c8"} Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.392902 4749 scope.go:117] "RemoveContainer" containerID="d84db3de45d02882c2e4ce91496e74df3cd6a3a551f24d625ec54e80f82f8b02" Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.392945 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzzfc" Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.437451 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hzzfc"] Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.438153 4749 scope.go:117] "RemoveContainer" containerID="1abe47d3879a6a6de50bb036407360d55b0538a11c28998e467d354b26145652" Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.451365 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hzzfc"] Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.691720 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" path="/var/lib/kubelet/pods/4773cd8e-04ef-45bf-ad2f-2c387310fd74/volumes" Feb 19 19:41:14 crc kubenswrapper[4749]: I0219 19:41:14.746339 4749 scope.go:117] "RemoveContainer" containerID="6e077dbe27b17d3327ca77eb88fdfe9fab25b6b82631a4da1afe7ce6819d40a8" Feb 19 19:41:15 crc kubenswrapper[4749]: I0219 19:41:15.678784 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:41:15 crc kubenswrapper[4749]: E0219 19:41:15.679285 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:41:26 crc kubenswrapper[4749]: I0219 19:41:26.685572 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:41:26 crc kubenswrapper[4749]: E0219 19:41:26.686696 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:41:41 crc kubenswrapper[4749]: I0219 19:41:41.679678 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:41:41 crc kubenswrapper[4749]: E0219 19:41:41.680556 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:41:54 crc kubenswrapper[4749]: I0219 19:41:54.679607 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:41:54 crc kubenswrapper[4749]: E0219 19:41:54.680527 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:42:07 crc kubenswrapper[4749]: I0219 19:42:07.679109 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:42:07 crc kubenswrapper[4749]: E0219 19:42:07.679859 4749 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:42:18 crc kubenswrapper[4749]: I0219 19:42:18.679606 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:42:18 crc kubenswrapper[4749]: E0219 19:42:18.680376 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:42:31 crc kubenswrapper[4749]: I0219 19:42:31.680274 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:42:32 crc kubenswrapper[4749]: I0219 19:42:32.096277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"be5d4f602982c70fd80fccb3c8e3e2f90ab6ede82af7ed1404cc6dd2c26c49ad"} Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.298762 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6krlh"] Feb 19 19:42:46 crc kubenswrapper[4749]: E0219 19:42:46.299731 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="extract-utilities" Feb 19 19:42:46 crc 
kubenswrapper[4749]: I0219 19:42:46.299767 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="extract-utilities" Feb 19 19:42:46 crc kubenswrapper[4749]: E0219 19:42:46.299783 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="extract-content" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.299788 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="extract-content" Feb 19 19:42:46 crc kubenswrapper[4749]: E0219 19:42:46.299805 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="registry-server" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.299811 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="registry-server" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.300091 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4773cd8e-04ef-45bf-ad2f-2c387310fd74" containerName="registry-server" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.302219 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.317874 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6krlh"] Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.379359 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgv4g\" (UniqueName: \"kubernetes.io/projected/1f905fe7-8488-4208-a331-c92b159859d9-kube-api-access-lgv4g\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.379469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-catalog-content\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.379539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-utilities\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.481543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgv4g\" (UniqueName: \"kubernetes.io/projected/1f905fe7-8488-4208-a331-c92b159859d9-kube-api-access-lgv4g\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.481829 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-catalog-content\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.481867 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-utilities\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.482465 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-catalog-content\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.482482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-utilities\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.500958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgv4g\" (UniqueName: \"kubernetes.io/projected/1f905fe7-8488-4208-a331-c92b159859d9-kube-api-access-lgv4g\") pod \"redhat-operators-6krlh\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:46 crc kubenswrapper[4749]: I0219 19:42:46.635959 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:42:47 crc kubenswrapper[4749]: I0219 19:42:47.183860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6krlh"] Feb 19 19:42:47 crc kubenswrapper[4749]: I0219 19:42:47.229192 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6krlh" event={"ID":"1f905fe7-8488-4208-a331-c92b159859d9","Type":"ContainerStarted","Data":"cec5e25ac885608c3aa980918a1f310e8d1412a9193be6d29df8883ee711fbb7"} Feb 19 19:42:48 crc kubenswrapper[4749]: I0219 19:42:48.243270 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f905fe7-8488-4208-a331-c92b159859d9" containerID="8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc" exitCode=0 Feb 19 19:42:48 crc kubenswrapper[4749]: I0219 19:42:48.243364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6krlh" event={"ID":"1f905fe7-8488-4208-a331-c92b159859d9","Type":"ContainerDied","Data":"8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc"} Feb 19 19:42:50 crc kubenswrapper[4749]: I0219 19:42:50.267900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6krlh" event={"ID":"1f905fe7-8488-4208-a331-c92b159859d9","Type":"ContainerStarted","Data":"068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd"} Feb 19 19:42:56 crc kubenswrapper[4749]: I0219 19:42:56.330797 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f905fe7-8488-4208-a331-c92b159859d9" containerID="068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd" exitCode=0 Feb 19 19:42:56 crc kubenswrapper[4749]: I0219 19:42:56.331326 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6krlh" 
event={"ID":"1f905fe7-8488-4208-a331-c92b159859d9","Type":"ContainerDied","Data":"068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd"} Feb 19 19:42:57 crc kubenswrapper[4749]: I0219 19:42:57.346856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6krlh" event={"ID":"1f905fe7-8488-4208-a331-c92b159859d9","Type":"ContainerStarted","Data":"734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8"} Feb 19 19:42:57 crc kubenswrapper[4749]: I0219 19:42:57.366497 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6krlh" podStartSLOduration=2.8762448689999998 podStartE2EDuration="11.366444715s" podCreationTimestamp="2026-02-19 19:42:46 +0000 UTC" firstStartedPulling="2026-02-19 19:42:48.245470197 +0000 UTC m=+4142.206690151" lastFinishedPulling="2026-02-19 19:42:56.735670043 +0000 UTC m=+4150.696889997" observedRunningTime="2026-02-19 19:42:57.364768154 +0000 UTC m=+4151.325988098" watchObservedRunningTime="2026-02-19 19:42:57.366444715 +0000 UTC m=+4151.327664669" Feb 19 19:43:06 crc kubenswrapper[4749]: I0219 19:43:06.636824 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:43:06 crc kubenswrapper[4749]: I0219 19:43:06.637440 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:43:07 crc kubenswrapper[4749]: I0219 19:43:07.687750 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6krlh" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="registry-server" probeResult="failure" output=< Feb 19 19:43:07 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 19:43:07 crc kubenswrapper[4749]: > Feb 19 19:43:16 crc kubenswrapper[4749]: I0219 19:43:16.696764 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:43:16 crc kubenswrapper[4749]: I0219 19:43:16.750391 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:43:17 crc kubenswrapper[4749]: I0219 19:43:17.495095 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6krlh"] Feb 19 19:43:18 crc kubenswrapper[4749]: I0219 19:43:18.535256 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6krlh" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="registry-server" containerID="cri-o://734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8" gracePeriod=2 Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.089156 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.171099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgv4g\" (UniqueName: \"kubernetes.io/projected/1f905fe7-8488-4208-a331-c92b159859d9-kube-api-access-lgv4g\") pod \"1f905fe7-8488-4208-a331-c92b159859d9\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.171271 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-utilities\") pod \"1f905fe7-8488-4208-a331-c92b159859d9\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.171381 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-catalog-content\") pod 
\"1f905fe7-8488-4208-a331-c92b159859d9\" (UID: \"1f905fe7-8488-4208-a331-c92b159859d9\") " Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.173303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-utilities" (OuterVolumeSpecName: "utilities") pod "1f905fe7-8488-4208-a331-c92b159859d9" (UID: "1f905fe7-8488-4208-a331-c92b159859d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.192239 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f905fe7-8488-4208-a331-c92b159859d9-kube-api-access-lgv4g" (OuterVolumeSpecName: "kube-api-access-lgv4g") pod "1f905fe7-8488-4208-a331-c92b159859d9" (UID: "1f905fe7-8488-4208-a331-c92b159859d9"). InnerVolumeSpecName "kube-api-access-lgv4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.273780 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgv4g\" (UniqueName: \"kubernetes.io/projected/1f905fe7-8488-4208-a331-c92b159859d9-kube-api-access-lgv4g\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.273817 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.319630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f905fe7-8488-4208-a331-c92b159859d9" (UID: "1f905fe7-8488-4208-a331-c92b159859d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.375739 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f905fe7-8488-4208-a331-c92b159859d9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.551657 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f905fe7-8488-4208-a331-c92b159859d9" containerID="734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8" exitCode=0 Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.551708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6krlh" event={"ID":"1f905fe7-8488-4208-a331-c92b159859d9","Type":"ContainerDied","Data":"734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8"} Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.551739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6krlh" event={"ID":"1f905fe7-8488-4208-a331-c92b159859d9","Type":"ContainerDied","Data":"cec5e25ac885608c3aa980918a1f310e8d1412a9193be6d29df8883ee711fbb7"} Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.551758 4749 scope.go:117] "RemoveContainer" containerID="734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.551930 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6krlh" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.595841 4749 scope.go:117] "RemoveContainer" containerID="068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.596607 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6krlh"] Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.606836 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6krlh"] Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.632663 4749 scope.go:117] "RemoveContainer" containerID="8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.704760 4749 scope.go:117] "RemoveContainer" containerID="734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8" Feb 19 19:43:19 crc kubenswrapper[4749]: E0219 19:43:19.705157 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8\": container with ID starting with 734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8 not found: ID does not exist" containerID="734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.705197 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8"} err="failed to get container status \"734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8\": rpc error: code = NotFound desc = could not find container \"734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8\": container with ID starting with 734c0b98f0d81970cef9d802d149bed160598e626f757a273ccf20171b8e57c8 not found: ID does 
not exist" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.705228 4749 scope.go:117] "RemoveContainer" containerID="068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd" Feb 19 19:43:19 crc kubenswrapper[4749]: E0219 19:43:19.705480 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd\": container with ID starting with 068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd not found: ID does not exist" containerID="068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.705508 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd"} err="failed to get container status \"068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd\": rpc error: code = NotFound desc = could not find container \"068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd\": container with ID starting with 068a5fc578a0c0c3a4b84fc4d5241bbf6f26ca0fabd7357dbc00fbf2f0c59ecd not found: ID does not exist" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.705526 4749 scope.go:117] "RemoveContainer" containerID="8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc" Feb 19 19:43:19 crc kubenswrapper[4749]: E0219 19:43:19.705794 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc\": container with ID starting with 8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc not found: ID does not exist" containerID="8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc" Feb 19 19:43:19 crc kubenswrapper[4749]: I0219 19:43:19.705825 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc"} err="failed to get container status \"8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc\": rpc error: code = NotFound desc = could not find container \"8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc\": container with ID starting with 8ccfc4818c924ea8f285b0bfe54deea4ad0b998f6c1e3922fdb958eec69e8ddc not found: ID does not exist" Feb 19 19:43:20 crc kubenswrapper[4749]: I0219 19:43:20.689913 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f905fe7-8488-4208-a331-c92b159859d9" path="/var/lib/kubelet/pods/1f905fe7-8488-4208-a331-c92b159859d9/volumes" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.569122 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mw8rm"] Feb 19 19:44:41 crc kubenswrapper[4749]: E0219 19:44:41.570062 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="extract-utilities" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.570074 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="extract-utilities" Feb 19 19:44:41 crc kubenswrapper[4749]: E0219 19:44:41.570085 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="registry-server" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.570090 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="registry-server" Feb 19 19:44:41 crc kubenswrapper[4749]: E0219 19:44:41.570109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="extract-content" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.570115 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="extract-content" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.570326 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f905fe7-8488-4208-a331-c92b159859d9" containerName="registry-server" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.571833 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.577730 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mw8rm"] Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.605782 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-catalog-content\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.606011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-utilities\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.606325 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfks\" (UniqueName: \"kubernetes.io/projected/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-kube-api-access-xvfks\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 
19:44:41.708168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-utilities\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.708254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfks\" (UniqueName: \"kubernetes.io/projected/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-kube-api-access-xvfks\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.708353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-catalog-content\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.708690 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-utilities\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.709048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-catalog-content\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.744264 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfks\" (UniqueName: \"kubernetes.io/projected/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-kube-api-access-xvfks\") pod \"certified-operators-mw8rm\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:41 crc kubenswrapper[4749]: I0219 19:44:41.909676 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:42 crc kubenswrapper[4749]: I0219 19:44:42.452719 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mw8rm"] Feb 19 19:44:42 crc kubenswrapper[4749]: W0219 19:44:42.460160 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda36ef3c8_bb6c_4aab_ac41_9dda6a36a9f4.slice/crio-ca11ef9b1595789087b8f5db8730283cf3635bc85b64b9e6c838a17186214a82 WatchSource:0}: Error finding container ca11ef9b1595789087b8f5db8730283cf3635bc85b64b9e6c838a17186214a82: Status 404 returned error can't find the container with id ca11ef9b1595789087b8f5db8730283cf3635bc85b64b9e6c838a17186214a82 Feb 19 19:44:43 crc kubenswrapper[4749]: I0219 19:44:43.351337 4749 generic.go:334] "Generic (PLEG): container finished" podID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerID="0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e" exitCode=0 Feb 19 19:44:43 crc kubenswrapper[4749]: I0219 19:44:43.351434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw8rm" event={"ID":"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4","Type":"ContainerDied","Data":"0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e"} Feb 19 19:44:43 crc kubenswrapper[4749]: I0219 19:44:43.351796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw8rm" 
event={"ID":"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4","Type":"ContainerStarted","Data":"ca11ef9b1595789087b8f5db8730283cf3635bc85b64b9e6c838a17186214a82"} Feb 19 19:44:43 crc kubenswrapper[4749]: I0219 19:44:43.354346 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:44:44 crc kubenswrapper[4749]: I0219 19:44:44.365319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw8rm" event={"ID":"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4","Type":"ContainerStarted","Data":"192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b"} Feb 19 19:44:45 crc kubenswrapper[4749]: E0219 19:44:45.633326 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda36ef3c8_bb6c_4aab_ac41_9dda6a36a9f4.slice/crio-192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:44:46 crc kubenswrapper[4749]: I0219 19:44:46.386016 4749 generic.go:334] "Generic (PLEG): container finished" podID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerID="192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b" exitCode=0 Feb 19 19:44:46 crc kubenswrapper[4749]: I0219 19:44:46.386108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw8rm" event={"ID":"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4","Type":"ContainerDied","Data":"192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b"} Feb 19 19:44:47 crc kubenswrapper[4749]: I0219 19:44:47.400246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw8rm" event={"ID":"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4","Type":"ContainerStarted","Data":"ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b"} Feb 19 19:44:47 crc kubenswrapper[4749]: 
I0219 19:44:47.421972 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mw8rm" podStartSLOduration=2.983392769 podStartE2EDuration="6.421944635s" podCreationTimestamp="2026-02-19 19:44:41 +0000 UTC" firstStartedPulling="2026-02-19 19:44:43.354051462 +0000 UTC m=+4257.315271416" lastFinishedPulling="2026-02-19 19:44:46.792603328 +0000 UTC m=+4260.753823282" observedRunningTime="2026-02-19 19:44:47.418119562 +0000 UTC m=+4261.379339536" watchObservedRunningTime="2026-02-19 19:44:47.421944635 +0000 UTC m=+4261.383164609" Feb 19 19:44:51 crc kubenswrapper[4749]: I0219 19:44:51.910038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:51 crc kubenswrapper[4749]: I0219 19:44:51.911574 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:52 crc kubenswrapper[4749]: I0219 19:44:52.207999 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:52 crc kubenswrapper[4749]: I0219 19:44:52.495272 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:52 crc kubenswrapper[4749]: I0219 19:44:52.563503 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mw8rm"] Feb 19 19:44:54 crc kubenswrapper[4749]: I0219 19:44:54.463207 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mw8rm" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="registry-server" containerID="cri-o://ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b" gracePeriod=2 Feb 19 19:44:54 crc kubenswrapper[4749]: I0219 19:44:54.724966 4749 patch_prober.go:28] interesting 
pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:44:54 crc kubenswrapper[4749]: I0219 19:44:54.725321 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:44:54 crc kubenswrapper[4749]: I0219 19:44:54.995214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.089186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvfks\" (UniqueName: \"kubernetes.io/projected/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-kube-api-access-xvfks\") pod \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.089452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-catalog-content\") pod \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.089655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-utilities\") pod \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\" (UID: \"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4\") " Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.090939 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-utilities" (OuterVolumeSpecName: "utilities") pod "a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" (UID: "a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.096268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-kube-api-access-xvfks" (OuterVolumeSpecName: "kube-api-access-xvfks") pod "a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" (UID: "a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4"). InnerVolumeSpecName "kube-api-access-xvfks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.140845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" (UID: "a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.192798 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.192833 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.192846 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvfks\" (UniqueName: \"kubernetes.io/projected/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4-kube-api-access-xvfks\") on node \"crc\" DevicePath \"\"" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.474696 4749 generic.go:334] "Generic (PLEG): container finished" podID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerID="ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b" exitCode=0 Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.474746 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mw8rm" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.474741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw8rm" event={"ID":"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4","Type":"ContainerDied","Data":"ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b"} Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.474800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw8rm" event={"ID":"a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4","Type":"ContainerDied","Data":"ca11ef9b1595789087b8f5db8730283cf3635bc85b64b9e6c838a17186214a82"} Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.474823 4749 scope.go:117] "RemoveContainer" containerID="ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.498562 4749 scope.go:117] "RemoveContainer" containerID="192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.511756 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mw8rm"] Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.520515 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mw8rm"] Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.532143 4749 scope.go:117] "RemoveContainer" containerID="0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.571209 4749 scope.go:117] "RemoveContainer" containerID="ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b" Feb 19 19:44:55 crc kubenswrapper[4749]: E0219 19:44:55.571596 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b\": container with ID starting with ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b not found: ID does not exist" containerID="ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.571639 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b"} err="failed to get container status \"ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b\": rpc error: code = NotFound desc = could not find container \"ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b\": container with ID starting with ca03af524ab9d9d42e842573becc4ba20ce9370f160be148b059b235fee7662b not found: ID does not exist" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.571665 4749 scope.go:117] "RemoveContainer" containerID="192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b" Feb 19 19:44:55 crc kubenswrapper[4749]: E0219 19:44:55.572057 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b\": container with ID starting with 192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b not found: ID does not exist" containerID="192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.572083 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b"} err="failed to get container status \"192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b\": rpc error: code = NotFound desc = could not find container \"192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b\": container with ID 
starting with 192dec474ec65ba0ae2391a6fa8ab94e50e0ed6b56724e635b5dfc2afb85b41b not found: ID does not exist" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.572104 4749 scope.go:117] "RemoveContainer" containerID="0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e" Feb 19 19:44:55 crc kubenswrapper[4749]: E0219 19:44:55.572295 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e\": container with ID starting with 0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e not found: ID does not exist" containerID="0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e" Feb 19 19:44:55 crc kubenswrapper[4749]: I0219 19:44:55.572326 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e"} err="failed to get container status \"0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e\": rpc error: code = NotFound desc = could not find container \"0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e\": container with ID starting with 0a00648b624815950ea2eef0fd719cc5f8ce3cbe36166e58d2df28bc9bfa532e not found: ID does not exist" Feb 19 19:44:56 crc kubenswrapper[4749]: I0219 19:44:56.697016 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" path="/var/lib/kubelet/pods/a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4/volumes" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.174408 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h"] Feb 19 19:45:00 crc kubenswrapper[4749]: E0219 19:45:00.175085 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="extract-content" Feb 19 
19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.175097 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="extract-content" Feb 19 19:45:00 crc kubenswrapper[4749]: E0219 19:45:00.175129 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="registry-server" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.175135 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="registry-server" Feb 19 19:45:00 crc kubenswrapper[4749]: E0219 19:45:00.175146 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="extract-utilities" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.175154 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="extract-utilities" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.175347 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36ef3c8-bb6c-4aab-ac41-9dda6a36a9f4" containerName="registry-server" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.175990 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.177988 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.186906 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.189706 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h"] Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.297155 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c200fbdb-c994-445f-a23a-53416d1dd9d8-config-volume\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.297428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c200fbdb-c994-445f-a23a-53416d1dd9d8-secret-volume\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.297805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ttx\" (UniqueName: \"kubernetes.io/projected/c200fbdb-c994-445f-a23a-53416d1dd9d8-kube-api-access-97ttx\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.399875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c200fbdb-c994-445f-a23a-53416d1dd9d8-config-volume\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.400109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c200fbdb-c994-445f-a23a-53416d1dd9d8-secret-volume\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.400248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ttx\" (UniqueName: \"kubernetes.io/projected/c200fbdb-c994-445f-a23a-53416d1dd9d8-kube-api-access-97ttx\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.401051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c200fbdb-c994-445f-a23a-53416d1dd9d8-config-volume\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.409186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c200fbdb-c994-445f-a23a-53416d1dd9d8-secret-volume\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.418583 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ttx\" (UniqueName: \"kubernetes.io/projected/c200fbdb-c994-445f-a23a-53416d1dd9d8-kube-api-access-97ttx\") pod \"collect-profiles-29525505-8sj6h\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.500636 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:00 crc kubenswrapper[4749]: I0219 19:45:00.950039 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h"] Feb 19 19:45:01 crc kubenswrapper[4749]: I0219 19:45:01.543994 4749 generic.go:334] "Generic (PLEG): container finished" podID="c200fbdb-c994-445f-a23a-53416d1dd9d8" containerID="91045deeec2368c3c19ae204a809cd4e835444bff269a95753bed00904e43eee" exitCode=0 Feb 19 19:45:01 crc kubenswrapper[4749]: I0219 19:45:01.544074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" event={"ID":"c200fbdb-c994-445f-a23a-53416d1dd9d8","Type":"ContainerDied","Data":"91045deeec2368c3c19ae204a809cd4e835444bff269a95753bed00904e43eee"} Feb 19 19:45:01 crc kubenswrapper[4749]: I0219 19:45:01.544291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" 
event={"ID":"c200fbdb-c994-445f-a23a-53416d1dd9d8","Type":"ContainerStarted","Data":"d2e66d29f896265bf6a577a162b2ba99533712821eb2b1327fe86fd265b80df5"} Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.072687 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.167648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97ttx\" (UniqueName: \"kubernetes.io/projected/c200fbdb-c994-445f-a23a-53416d1dd9d8-kube-api-access-97ttx\") pod \"c200fbdb-c994-445f-a23a-53416d1dd9d8\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.167956 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c200fbdb-c994-445f-a23a-53416d1dd9d8-secret-volume\") pod \"c200fbdb-c994-445f-a23a-53416d1dd9d8\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.168187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c200fbdb-c994-445f-a23a-53416d1dd9d8-config-volume\") pod \"c200fbdb-c994-445f-a23a-53416d1dd9d8\" (UID: \"c200fbdb-c994-445f-a23a-53416d1dd9d8\") " Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.168773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c200fbdb-c994-445f-a23a-53416d1dd9d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "c200fbdb-c994-445f-a23a-53416d1dd9d8" (UID: "c200fbdb-c994-445f-a23a-53416d1dd9d8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.174268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c200fbdb-c994-445f-a23a-53416d1dd9d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c200fbdb-c994-445f-a23a-53416d1dd9d8" (UID: "c200fbdb-c994-445f-a23a-53416d1dd9d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.174974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c200fbdb-c994-445f-a23a-53416d1dd9d8-kube-api-access-97ttx" (OuterVolumeSpecName: "kube-api-access-97ttx") pod "c200fbdb-c994-445f-a23a-53416d1dd9d8" (UID: "c200fbdb-c994-445f-a23a-53416d1dd9d8"). InnerVolumeSpecName "kube-api-access-97ttx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.270958 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c200fbdb-c994-445f-a23a-53416d1dd9d8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.271006 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97ttx\" (UniqueName: \"kubernetes.io/projected/c200fbdb-c994-445f-a23a-53416d1dd9d8-kube-api-access-97ttx\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.271021 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c200fbdb-c994-445f-a23a-53416d1dd9d8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.567463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" 
event={"ID":"c200fbdb-c994-445f-a23a-53416d1dd9d8","Type":"ContainerDied","Data":"d2e66d29f896265bf6a577a162b2ba99533712821eb2b1327fe86fd265b80df5"} Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.567533 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2e66d29f896265bf6a577a162b2ba99533712821eb2b1327fe86fd265b80df5" Feb 19 19:45:03 crc kubenswrapper[4749]: I0219 19:45:03.567579 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-8sj6h" Feb 19 19:45:04 crc kubenswrapper[4749]: I0219 19:45:04.207913 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc"] Feb 19 19:45:04 crc kubenswrapper[4749]: I0219 19:45:04.218716 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525460-dzdjc"] Feb 19 19:45:04 crc kubenswrapper[4749]: I0219 19:45:04.690479 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5286134-7c72-4d2e-b922-292886feb2c5" path="/var/lib/kubelet/pods/a5286134-7c72-4d2e-b922-292886feb2c5/volumes" Feb 19 19:45:24 crc kubenswrapper[4749]: I0219 19:45:24.725282 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:45:24 crc kubenswrapper[4749]: I0219 19:45:24.726539 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:45:25 crc 
kubenswrapper[4749]: I0219 19:45:25.126560 4749 scope.go:117] "RemoveContainer" containerID="0e6420a8bfd2e1f6598da4a56c6888903004ac5af42e017395320fda4929fba4" Feb 19 19:45:54 crc kubenswrapper[4749]: I0219 19:45:54.725119 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:45:54 crc kubenswrapper[4749]: I0219 19:45:54.725720 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:45:54 crc kubenswrapper[4749]: I0219 19:45:54.725764 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 19:45:54 crc kubenswrapper[4749]: I0219 19:45:54.726546 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be5d4f602982c70fd80fccb3c8e3e2f90ab6ede82af7ed1404cc6dd2c26c49ad"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:45:54 crc kubenswrapper[4749]: I0219 19:45:54.726607 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://be5d4f602982c70fd80fccb3c8e3e2f90ab6ede82af7ed1404cc6dd2c26c49ad" gracePeriod=600 Feb 19 19:45:55 crc kubenswrapper[4749]: I0219 
19:45:55.103718 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="be5d4f602982c70fd80fccb3c8e3e2f90ab6ede82af7ed1404cc6dd2c26c49ad" exitCode=0 Feb 19 19:45:55 crc kubenswrapper[4749]: I0219 19:45:55.103874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"be5d4f602982c70fd80fccb3c8e3e2f90ab6ede82af7ed1404cc6dd2c26c49ad"} Feb 19 19:45:55 crc kubenswrapper[4749]: I0219 19:45:55.104050 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"} Feb 19 19:45:55 crc kubenswrapper[4749]: I0219 19:45:55.104074 4749 scope.go:117] "RemoveContainer" containerID="85eff51d6ef05b7143f5f07f5810148d6fb33eb4c29135ee10b0191c734d8dfd" Feb 19 19:48:24 crc kubenswrapper[4749]: I0219 19:48:24.725396 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:48:24 crc kubenswrapper[4749]: I0219 19:48:24.725936 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:48:54 crc kubenswrapper[4749]: I0219 19:48:54.725772 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:48:54 crc kubenswrapper[4749]: I0219 19:48:54.726487 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.558852 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kb4z8"] Feb 19 19:49:06 crc kubenswrapper[4749]: E0219 19:49:06.559785 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c200fbdb-c994-445f-a23a-53416d1dd9d8" containerName="collect-profiles" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.559798 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c200fbdb-c994-445f-a23a-53416d1dd9d8" containerName="collect-profiles" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.559982 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c200fbdb-c994-445f-a23a-53416d1dd9d8" containerName="collect-profiles" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.561572 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.573199 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb4z8"] Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.594133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-catalog-content\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.594244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-utilities\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.594308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8tr9\" (UniqueName: \"kubernetes.io/projected/47d6ceba-063c-4281-b231-ed33bbcdd888-kube-api-access-k8tr9\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.696539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-catalog-content\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.696643 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-utilities\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.696693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8tr9\" (UniqueName: \"kubernetes.io/projected/47d6ceba-063c-4281-b231-ed33bbcdd888-kube-api-access-k8tr9\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.697252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-catalog-content\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.697283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-utilities\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.719763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8tr9\" (UniqueName: \"kubernetes.io/projected/47d6ceba-063c-4281-b231-ed33bbcdd888-kube-api-access-k8tr9\") pod \"redhat-marketplace-kb4z8\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:06 crc kubenswrapper[4749]: I0219 19:49:06.880095 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:07 crc kubenswrapper[4749]: I0219 19:49:07.434642 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb4z8"] Feb 19 19:49:07 crc kubenswrapper[4749]: I0219 19:49:07.858692 4749 generic.go:334] "Generic (PLEG): container finished" podID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerID="81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630" exitCode=0 Feb 19 19:49:07 crc kubenswrapper[4749]: I0219 19:49:07.858759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb4z8" event={"ID":"47d6ceba-063c-4281-b231-ed33bbcdd888","Type":"ContainerDied","Data":"81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630"} Feb 19 19:49:07 crc kubenswrapper[4749]: I0219 19:49:07.859004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb4z8" event={"ID":"47d6ceba-063c-4281-b231-ed33bbcdd888","Type":"ContainerStarted","Data":"d91e2ec68e79f436c41abe69b8e5fb446dace668db902c4a83a0a46373ef6698"} Feb 19 19:49:08 crc kubenswrapper[4749]: I0219 19:49:08.879064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb4z8" event={"ID":"47d6ceba-063c-4281-b231-ed33bbcdd888","Type":"ContainerStarted","Data":"1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99"} Feb 19 19:49:09 crc kubenswrapper[4749]: I0219 19:49:09.889174 4749 generic.go:334] "Generic (PLEG): container finished" podID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerID="1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99" exitCode=0 Feb 19 19:49:09 crc kubenswrapper[4749]: I0219 19:49:09.889233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb4z8" 
event={"ID":"47d6ceba-063c-4281-b231-ed33bbcdd888","Type":"ContainerDied","Data":"1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99"} Feb 19 19:49:10 crc kubenswrapper[4749]: I0219 19:49:10.901704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb4z8" event={"ID":"47d6ceba-063c-4281-b231-ed33bbcdd888","Type":"ContainerStarted","Data":"2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733"} Feb 19 19:49:10 crc kubenswrapper[4749]: I0219 19:49:10.926232 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kb4z8" podStartSLOduration=2.434997601 podStartE2EDuration="4.926211772s" podCreationTimestamp="2026-02-19 19:49:06 +0000 UTC" firstStartedPulling="2026-02-19 19:49:07.862011803 +0000 UTC m=+4521.823231757" lastFinishedPulling="2026-02-19 19:49:10.353225974 +0000 UTC m=+4524.314445928" observedRunningTime="2026-02-19 19:49:10.917743548 +0000 UTC m=+4524.878963512" watchObservedRunningTime="2026-02-19 19:49:10.926211772 +0000 UTC m=+4524.887431726" Feb 19 19:49:16 crc kubenswrapper[4749]: I0219 19:49:16.880528 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:16 crc kubenswrapper[4749]: I0219 19:49:16.880994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:16 crc kubenswrapper[4749]: I0219 19:49:16.931342 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:17 crc kubenswrapper[4749]: I0219 19:49:17.000381 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:17 crc kubenswrapper[4749]: I0219 19:49:17.169457 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kb4z8"] Feb 19 19:49:18 crc kubenswrapper[4749]: I0219 19:49:18.968217 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kb4z8" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="registry-server" containerID="cri-o://2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733" gracePeriod=2 Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.503055 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.589702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-catalog-content\") pod \"47d6ceba-063c-4281-b231-ed33bbcdd888\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.589763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-utilities\") pod \"47d6ceba-063c-4281-b231-ed33bbcdd888\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.589903 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8tr9\" (UniqueName: \"kubernetes.io/projected/47d6ceba-063c-4281-b231-ed33bbcdd888-kube-api-access-k8tr9\") pod \"47d6ceba-063c-4281-b231-ed33bbcdd888\" (UID: \"47d6ceba-063c-4281-b231-ed33bbcdd888\") " Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.590977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-utilities" (OuterVolumeSpecName: "utilities") pod "47d6ceba-063c-4281-b231-ed33bbcdd888" (UID: 
"47d6ceba-063c-4281-b231-ed33bbcdd888"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.607307 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d6ceba-063c-4281-b231-ed33bbcdd888-kube-api-access-k8tr9" (OuterVolumeSpecName: "kube-api-access-k8tr9") pod "47d6ceba-063c-4281-b231-ed33bbcdd888" (UID: "47d6ceba-063c-4281-b231-ed33bbcdd888"). InnerVolumeSpecName "kube-api-access-k8tr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.616123 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47d6ceba-063c-4281-b231-ed33bbcdd888" (UID: "47d6ceba-063c-4281-b231-ed33bbcdd888"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.693234 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.693263 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d6ceba-063c-4281-b231-ed33bbcdd888-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.693272 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8tr9\" (UniqueName: \"kubernetes.io/projected/47d6ceba-063c-4281-b231-ed33bbcdd888-kube-api-access-k8tr9\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.979089 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerID="2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733" exitCode=0 Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.979131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb4z8" event={"ID":"47d6ceba-063c-4281-b231-ed33bbcdd888","Type":"ContainerDied","Data":"2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733"} Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.979170 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb4z8" Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.980352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb4z8" event={"ID":"47d6ceba-063c-4281-b231-ed33bbcdd888","Type":"ContainerDied","Data":"d91e2ec68e79f436c41abe69b8e5fb446dace668db902c4a83a0a46373ef6698"} Feb 19 19:49:19 crc kubenswrapper[4749]: I0219 19:49:19.980450 4749 scope.go:117] "RemoveContainer" containerID="2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.009159 4749 scope.go:117] "RemoveContainer" containerID="1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.016624 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb4z8"] Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.026105 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb4z8"] Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.147345 4749 scope.go:117] "RemoveContainer" containerID="81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.205100 4749 scope.go:117] "RemoveContainer" 
containerID="2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733" Feb 19 19:49:20 crc kubenswrapper[4749]: E0219 19:49:20.205732 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733\": container with ID starting with 2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733 not found: ID does not exist" containerID="2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.205771 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733"} err="failed to get container status \"2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733\": rpc error: code = NotFound desc = could not find container \"2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733\": container with ID starting with 2dc89ebac8e33dff6d58659c40a143c9dcb8b65a9642dced110bcfcdd57e4733 not found: ID does not exist" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.205798 4749 scope.go:117] "RemoveContainer" containerID="1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99" Feb 19 19:49:20 crc kubenswrapper[4749]: E0219 19:49:20.206225 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99\": container with ID starting with 1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99 not found: ID does not exist" containerID="1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.206333 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99"} err="failed to get container status \"1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99\": rpc error: code = NotFound desc = could not find container \"1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99\": container with ID starting with 1e31fc603f4485f19ce067ef755ada325ddee07ddfa8389fb79382026fbabd99 not found: ID does not exist" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.206436 4749 scope.go:117] "RemoveContainer" containerID="81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630" Feb 19 19:49:20 crc kubenswrapper[4749]: E0219 19:49:20.206772 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630\": container with ID starting with 81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630 not found: ID does not exist" containerID="81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.206807 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630"} err="failed to get container status \"81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630\": rpc error: code = NotFound desc = could not find container \"81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630\": container with ID starting with 81df7b1a22c9ce907693b8f95c5a366a4fd46dd97537cabb9cbebdb35a884630 not found: ID does not exist" Feb 19 19:49:20 crc kubenswrapper[4749]: I0219 19:49:20.700382 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" path="/var/lib/kubelet/pods/47d6ceba-063c-4281-b231-ed33bbcdd888/volumes" Feb 19 19:49:24 crc kubenswrapper[4749]: I0219 
19:49:24.725308 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:49:24 crc kubenswrapper[4749]: I0219 19:49:24.725601 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:49:24 crc kubenswrapper[4749]: I0219 19:49:24.725651 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 19:49:24 crc kubenswrapper[4749]: I0219 19:49:24.726342 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:49:24 crc kubenswrapper[4749]: I0219 19:49:24.726400 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" gracePeriod=600 Feb 19 19:49:24 crc kubenswrapper[4749]: E0219 19:49:24.845826 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:49:25 crc kubenswrapper[4749]: I0219 19:49:25.059409 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" exitCode=0 Feb 19 19:49:25 crc kubenswrapper[4749]: I0219 19:49:25.059455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"} Feb 19 19:49:25 crc kubenswrapper[4749]: I0219 19:49:25.059501 4749 scope.go:117] "RemoveContainer" containerID="be5d4f602982c70fd80fccb3c8e3e2f90ab6ede82af7ed1404cc6dd2c26c49ad" Feb 19 19:49:25 crc kubenswrapper[4749]: I0219 19:49:25.060219 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:49:25 crc kubenswrapper[4749]: E0219 19:49:25.060456 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:49:39 crc kubenswrapper[4749]: I0219 19:49:39.679121 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:49:39 crc kubenswrapper[4749]: E0219 19:49:39.679865 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:49:52 crc kubenswrapper[4749]: I0219 19:49:52.679578 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:49:52 crc kubenswrapper[4749]: E0219 19:49:52.680455 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:50:04 crc kubenswrapper[4749]: I0219 19:50:04.679670 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:50:04 crc kubenswrapper[4749]: E0219 19:50:04.680609 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:50:17 crc kubenswrapper[4749]: I0219 19:50:17.679873 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:50:17 crc kubenswrapper[4749]: E0219 19:50:17.680664 4749 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:50:32 crc kubenswrapper[4749]: I0219 19:50:32.679467 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:50:32 crc kubenswrapper[4749]: E0219 19:50:32.680303 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:50:46 crc kubenswrapper[4749]: I0219 19:50:46.678935 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:50:46 crc kubenswrapper[4749]: E0219 19:50:46.679773 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:50:59 crc kubenswrapper[4749]: I0219 19:50:59.679261 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:50:59 crc kubenswrapper[4749]: E0219 19:50:59.680080 4749 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:51:11 crc kubenswrapper[4749]: I0219 19:51:11.679349 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:51:11 crc kubenswrapper[4749]: E0219 19:51:11.680284 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:51:22 crc kubenswrapper[4749]: I0219 19:51:22.678953 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:51:22 crc kubenswrapper[4749]: E0219 19:51:22.679703 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:51:34 crc kubenswrapper[4749]: I0219 19:51:34.679939 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:51:34 crc kubenswrapper[4749]: E0219 
19:51:34.680872 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:51:49 crc kubenswrapper[4749]: I0219 19:51:49.678853 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:51:49 crc kubenswrapper[4749]: E0219 19:51:49.679617 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:52:04 crc kubenswrapper[4749]: I0219 19:52:04.679965 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:52:04 crc kubenswrapper[4749]: E0219 19:52:04.680739 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.696167 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qm2hx"] Feb 19 19:52:11 crc 
kubenswrapper[4749]: E0219 19:52:11.697126 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="extract-utilities" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.697143 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="extract-utilities" Feb 19 19:52:11 crc kubenswrapper[4749]: E0219 19:52:11.697173 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="extract-content" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.697180 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="extract-content" Feb 19 19:52:11 crc kubenswrapper[4749]: E0219 19:52:11.697209 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="registry-server" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.697215 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="registry-server" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.697438 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d6ceba-063c-4281-b231-ed33bbcdd888" containerName="registry-server" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.699117 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.715406 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm2hx"] Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.813638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdbq6\" (UniqueName: \"kubernetes.io/projected/2ea039f2-468c-4cb6-b805-d4aa382ae46f-kube-api-access-gdbq6\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.813732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-catalog-content\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.813789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-utilities\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.915562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdbq6\" (UniqueName: \"kubernetes.io/projected/2ea039f2-468c-4cb6-b805-d4aa382ae46f-kube-api-access-gdbq6\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.915615 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-catalog-content\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.915659 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-utilities\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.916287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-catalog-content\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.916320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-utilities\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:11 crc kubenswrapper[4749]: I0219 19:52:11.945008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdbq6\" (UniqueName: \"kubernetes.io/projected/2ea039f2-468c-4cb6-b805-d4aa382ae46f-kube-api-access-gdbq6\") pod \"community-operators-qm2hx\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") " pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:12 crc kubenswrapper[4749]: I0219 19:52:12.031525 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qm2hx" Feb 19 19:52:12 crc kubenswrapper[4749]: I0219 19:52:12.561976 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm2hx"] Feb 19 19:52:12 crc kubenswrapper[4749]: I0219 19:52:12.630071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2hx" event={"ID":"2ea039f2-468c-4cb6-b805-d4aa382ae46f","Type":"ContainerStarted","Data":"c1c0f2bf16ee0880786f7857cd45c348a96354cdf1c5de039c1a87f6e83579ab"} Feb 19 19:52:13 crc kubenswrapper[4749]: I0219 19:52:13.640281 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerID="f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7" exitCode=0 Feb 19 19:52:13 crc kubenswrapper[4749]: I0219 19:52:13.640347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2hx" event={"ID":"2ea039f2-468c-4cb6-b805-d4aa382ae46f","Type":"ContainerDied","Data":"f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7"} Feb 19 19:52:13 crc kubenswrapper[4749]: I0219 19:52:13.644352 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:52:14 crc kubenswrapper[4749]: I0219 19:52:14.650599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2hx" event={"ID":"2ea039f2-468c-4cb6-b805-d4aa382ae46f","Type":"ContainerStarted","Data":"209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26"} Feb 19 19:52:16 crc kubenswrapper[4749]: I0219 19:52:16.671217 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerID="209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26" exitCode=0 Feb 19 19:52:16 crc kubenswrapper[4749]: I0219 19:52:16.671258 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-qm2hx" event={"ID":"2ea039f2-468c-4cb6-b805-d4aa382ae46f","Type":"ContainerDied","Data":"209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26"}
Feb 19 19:52:16 crc kubenswrapper[4749]: I0219 19:52:16.679491 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:52:16 crc kubenswrapper[4749]: E0219 19:52:16.679792 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:52:17 crc kubenswrapper[4749]: I0219 19:52:17.682055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2hx" event={"ID":"2ea039f2-468c-4cb6-b805-d4aa382ae46f","Type":"ContainerStarted","Data":"cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1"}
Feb 19 19:52:17 crc kubenswrapper[4749]: I0219 19:52:17.704584 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qm2hx" podStartSLOduration=3.27963059 podStartE2EDuration="6.704565472s" podCreationTimestamp="2026-02-19 19:52:11 +0000 UTC" firstStartedPulling="2026-02-19 19:52:13.64404336 +0000 UTC m=+4707.605263314" lastFinishedPulling="2026-02-19 19:52:17.068978242 +0000 UTC m=+4711.030198196" observedRunningTime="2026-02-19 19:52:17.697374558 +0000 UTC m=+4711.658594522" watchObservedRunningTime="2026-02-19 19:52:17.704565472 +0000 UTC m=+4711.665785426"
Feb 19 19:52:22 crc kubenswrapper[4749]: I0219 19:52:22.032418 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qm2hx"
Feb 19 19:52:22 crc kubenswrapper[4749]: I0219 19:52:22.033286 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qm2hx"
Feb 19 19:52:22 crc kubenswrapper[4749]: I0219 19:52:22.108092 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qm2hx"
Feb 19 19:52:22 crc kubenswrapper[4749]: I0219 19:52:22.775783 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qm2hx"
Feb 19 19:52:22 crc kubenswrapper[4749]: I0219 19:52:22.826021 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qm2hx"]
Feb 19 19:52:24 crc kubenswrapper[4749]: I0219 19:52:24.745922 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qm2hx" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="registry-server" containerID="cri-o://cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1" gracePeriod=2
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.217422 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qm2hx"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.324878 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-utilities\") pod \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") "
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.325961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdbq6\" (UniqueName: \"kubernetes.io/projected/2ea039f2-468c-4cb6-b805-d4aa382ae46f-kube-api-access-gdbq6\") pod \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") "
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.326101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-catalog-content\") pod \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\" (UID: \"2ea039f2-468c-4cb6-b805-d4aa382ae46f\") "
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.326093 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-utilities" (OuterVolumeSpecName: "utilities") pod "2ea039f2-468c-4cb6-b805-d4aa382ae46f" (UID: "2ea039f2-468c-4cb6-b805-d4aa382ae46f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.327334 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.334203 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea039f2-468c-4cb6-b805-d4aa382ae46f-kube-api-access-gdbq6" (OuterVolumeSpecName: "kube-api-access-gdbq6") pod "2ea039f2-468c-4cb6-b805-d4aa382ae46f" (UID: "2ea039f2-468c-4cb6-b805-d4aa382ae46f"). InnerVolumeSpecName "kube-api-access-gdbq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.398313 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ea039f2-468c-4cb6-b805-d4aa382ae46f" (UID: "2ea039f2-468c-4cb6-b805-d4aa382ae46f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.429487 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdbq6\" (UniqueName: \"kubernetes.io/projected/2ea039f2-468c-4cb6-b805-d4aa382ae46f-kube-api-access-gdbq6\") on node \"crc\" DevicePath \"\""
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.429752 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea039f2-468c-4cb6-b805-d4aa382ae46f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.756442 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerID="cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1" exitCode=0
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.756486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2hx" event={"ID":"2ea039f2-468c-4cb6-b805-d4aa382ae46f","Type":"ContainerDied","Data":"cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1"}
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.756504 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qm2hx"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.756523 4749 scope.go:117] "RemoveContainer" containerID="cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.756513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2hx" event={"ID":"2ea039f2-468c-4cb6-b805-d4aa382ae46f","Type":"ContainerDied","Data":"c1c0f2bf16ee0880786f7857cd45c348a96354cdf1c5de039c1a87f6e83579ab"}
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.776660 4749 scope.go:117] "RemoveContainer" containerID="209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.803748 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qm2hx"]
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.803848 4749 scope.go:117] "RemoveContainer" containerID="f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.818573 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qm2hx"]
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.862256 4749 scope.go:117] "RemoveContainer" containerID="cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1"
Feb 19 19:52:25 crc kubenswrapper[4749]: E0219 19:52:25.862640 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1\": container with ID starting with cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1 not found: ID does not exist" containerID="cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.862686 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1"} err="failed to get container status \"cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1\": rpc error: code = NotFound desc = could not find container \"cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1\": container with ID starting with cc098401d1ba0983c9315b1df172d361ce7d7fa3afbd17af05af290dd725b3e1 not found: ID does not exist"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.862714 4749 scope.go:117] "RemoveContainer" containerID="209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26"
Feb 19 19:52:25 crc kubenswrapper[4749]: E0219 19:52:25.864597 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26\": container with ID starting with 209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26 not found: ID does not exist" containerID="209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.864638 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26"} err="failed to get container status \"209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26\": rpc error: code = NotFound desc = could not find container \"209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26\": container with ID starting with 209931ef2120f9aa0d255864f687ed2512f86e1259c0d817e8c1de02bce99a26 not found: ID does not exist"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.864666 4749 scope.go:117] "RemoveContainer" containerID="f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7"
Feb 19 19:52:25 crc kubenswrapper[4749]: E0219 19:52:25.866141 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7\": container with ID starting with f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7 not found: ID does not exist" containerID="f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7"
Feb 19 19:52:25 crc kubenswrapper[4749]: I0219 19:52:25.866169 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7"} err="failed to get container status \"f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7\": rpc error: code = NotFound desc = could not find container \"f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7\": container with ID starting with f4fb659718608aa7c9aa88cedc8923c0e87a756111c939b1e4ea3beb8f9d53f7 not found: ID does not exist"
Feb 19 19:52:26 crc kubenswrapper[4749]: I0219 19:52:26.689707 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" path="/var/lib/kubelet/pods/2ea039f2-468c-4cb6-b805-d4aa382ae46f/volumes"
Feb 19 19:52:30 crc kubenswrapper[4749]: I0219 19:52:30.679850 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:52:30 crc kubenswrapper[4749]: E0219 19:52:30.680728 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:52:41 crc kubenswrapper[4749]: I0219 19:52:41.679359 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:52:41 crc kubenswrapper[4749]: E0219 19:52:41.681000 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:52:53 crc kubenswrapper[4749]: I0219 19:52:53.679467 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:52:53 crc kubenswrapper[4749]: E0219 19:52:53.680354 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:53:08 crc kubenswrapper[4749]: I0219 19:53:08.683366 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:53:08 crc kubenswrapper[4749]: E0219 19:53:08.684245 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:53:22 crc kubenswrapper[4749]: I0219 19:53:22.679373 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:53:22 crc kubenswrapper[4749]: E0219 19:53:22.680178 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:53:35 crc kubenswrapper[4749]: I0219 19:53:35.679587 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:53:35 crc kubenswrapper[4749]: E0219 19:53:35.680221 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:53:50 crc kubenswrapper[4749]: I0219 19:53:50.679142 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:53:50 crc kubenswrapper[4749]: E0219 19:53:50.679968 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:54:04 crc kubenswrapper[4749]: I0219 19:54:04.681022 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:54:04 crc kubenswrapper[4749]: E0219 19:54:04.681955 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:54:16 crc kubenswrapper[4749]: I0219 19:54:16.687107 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:54:16 crc kubenswrapper[4749]: E0219 19:54:16.687801 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 19:54:30 crc kubenswrapper[4749]: I0219 19:54:30.680488 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763"
Feb 19 19:54:30 crc kubenswrapper[4749]: I0219 19:54:30.966456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"67efe7d26f397e5c6c83b5c604adacb30f5ae2a6ee4a04b1e3e91db09ca88945"}
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.009945 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mvn5p"]
Feb 19 19:54:34 crc kubenswrapper[4749]: E0219 19:54:34.010986 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="extract-utilities"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.011004 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="extract-utilities"
Feb 19 19:54:34 crc kubenswrapper[4749]: E0219 19:54:34.011069 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="extract-content"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.011080 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="extract-content"
Feb 19 19:54:34 crc kubenswrapper[4749]: E0219 19:54:34.011093 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="registry-server"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.011101 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="registry-server"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.011326 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea039f2-468c-4cb6-b805-d4aa382ae46f" containerName="registry-server"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.013155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.025814 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvn5p"]
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.140442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-catalog-content\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.140570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpksr\" (UniqueName: \"kubernetes.io/projected/bffa375b-600a-4b38-a5cf-1f0573aca7d6-kube-api-access-qpksr\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.140615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-utilities\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.242730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-catalog-content\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.242855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpksr\" (UniqueName: \"kubernetes.io/projected/bffa375b-600a-4b38-a5cf-1f0573aca7d6-kube-api-access-qpksr\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.242889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-utilities\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.243446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-catalog-content\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.243481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-utilities\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.266106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpksr\" (UniqueName: \"kubernetes.io/projected/bffa375b-600a-4b38-a5cf-1f0573aca7d6-kube-api-access-qpksr\") pod \"redhat-operators-mvn5p\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") " pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.334633 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:34 crc kubenswrapper[4749]: I0219 19:54:34.830215 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvn5p"]
Feb 19 19:54:35 crc kubenswrapper[4749]: I0219 19:54:35.017350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvn5p" event={"ID":"bffa375b-600a-4b38-a5cf-1f0573aca7d6","Type":"ContainerStarted","Data":"66305ef23c664b3135736c23c0875c2bca05de440bc4f9acbd75ad23530708d4"}
Feb 19 19:54:36 crc kubenswrapper[4749]: I0219 19:54:36.027719 4749 generic.go:334] "Generic (PLEG): container finished" podID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerID="8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022" exitCode=0
Feb 19 19:54:36 crc kubenswrapper[4749]: I0219 19:54:36.027847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvn5p" event={"ID":"bffa375b-600a-4b38-a5cf-1f0573aca7d6","Type":"ContainerDied","Data":"8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022"}
Feb 19 19:54:37 crc kubenswrapper[4749]: I0219 19:54:37.038191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvn5p" event={"ID":"bffa375b-600a-4b38-a5cf-1f0573aca7d6","Type":"ContainerStarted","Data":"d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe"}
Feb 19 19:54:42 crc kubenswrapper[4749]: I0219 19:54:42.091242 4749 generic.go:334] "Generic (PLEG): container finished" podID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerID="d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe" exitCode=0
Feb 19 19:54:42 crc kubenswrapper[4749]: I0219 19:54:42.091319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvn5p" event={"ID":"bffa375b-600a-4b38-a5cf-1f0573aca7d6","Type":"ContainerDied","Data":"d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe"}
Feb 19 19:54:43 crc kubenswrapper[4749]: I0219 19:54:43.104278 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvn5p" event={"ID":"bffa375b-600a-4b38-a5cf-1f0573aca7d6","Type":"ContainerStarted","Data":"de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d"}
Feb 19 19:54:43 crc kubenswrapper[4749]: I0219 19:54:43.129844 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mvn5p" podStartSLOduration=3.700305549 podStartE2EDuration="10.129827413s" podCreationTimestamp="2026-02-19 19:54:33 +0000 UTC" firstStartedPulling="2026-02-19 19:54:36.03129781 +0000 UTC m=+4849.992517764" lastFinishedPulling="2026-02-19 19:54:42.460819674 +0000 UTC m=+4856.422039628" observedRunningTime="2026-02-19 19:54:43.125639032 +0000 UTC m=+4857.086858986" watchObservedRunningTime="2026-02-19 19:54:43.129827413 +0000 UTC m=+4857.091047367"
Feb 19 19:54:44 crc kubenswrapper[4749]: I0219 19:54:44.335062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:44 crc kubenswrapper[4749]: I0219 19:54:44.335102 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:54:45 crc kubenswrapper[4749]: I0219 19:54:45.380333 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mvn5p" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:54:45 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:54:45 crc kubenswrapper[4749]: >
Feb 19 19:54:55 crc kubenswrapper[4749]: I0219 19:54:55.389500 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mvn5p" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:54:55 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:54:55 crc kubenswrapper[4749]: >
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.649597 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bjw78"]
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.654569 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.661918 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjw78"]
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.823168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-utilities\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.823290 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-catalog-content\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.823404 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6248\" (UniqueName: \"kubernetes.io/projected/d8c5c0b7-a76e-46e1-aa95-066286eab595-kube-api-access-v6248\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.924663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6248\" (UniqueName: \"kubernetes.io/projected/d8c5c0b7-a76e-46e1-aa95-066286eab595-kube-api-access-v6248\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.924820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-utilities\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.924882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-catalog-content\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.925387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-catalog-content\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.925620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-utilities\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.948327 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6248\" (UniqueName: \"kubernetes.io/projected/d8c5c0b7-a76e-46e1-aa95-066286eab595-kube-api-access-v6248\") pod \"certified-operators-bjw78\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:00 crc kubenswrapper[4749]: I0219 19:55:00.984214 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjw78"
Feb 19 19:55:01 crc kubenswrapper[4749]: I0219 19:55:01.587387 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjw78"]
Feb 19 19:55:01 crc kubenswrapper[4749]: W0219 19:55:01.605917 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8c5c0b7_a76e_46e1_aa95_066286eab595.slice/crio-b7edcb1c4f7e58327e0da5b29d7f1e1e2865ea92c5c70c9406bf024998b02bc3 WatchSource:0}: Error finding container b7edcb1c4f7e58327e0da5b29d7f1e1e2865ea92c5c70c9406bf024998b02bc3: Status 404 returned error can't find the container with id b7edcb1c4f7e58327e0da5b29d7f1e1e2865ea92c5c70c9406bf024998b02bc3
Feb 19 19:55:02 crc kubenswrapper[4749]: I0219 19:55:02.294012 4749 generic.go:334] "Generic (PLEG): container finished" podID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerID="a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93" exitCode=0
Feb 19 19:55:02 crc kubenswrapper[4749]: I0219 19:55:02.294113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjw78" event={"ID":"d8c5c0b7-a76e-46e1-aa95-066286eab595","Type":"ContainerDied","Data":"a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93"}
Feb 19 19:55:02 crc kubenswrapper[4749]: I0219 19:55:02.294407 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjw78" event={"ID":"d8c5c0b7-a76e-46e1-aa95-066286eab595","Type":"ContainerStarted","Data":"b7edcb1c4f7e58327e0da5b29d7f1e1e2865ea92c5c70c9406bf024998b02bc3"}
Feb 19 19:55:04 crc kubenswrapper[4749]: I0219 19:55:04.319750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjw78" event={"ID":"d8c5c0b7-a76e-46e1-aa95-066286eab595","Type":"ContainerStarted","Data":"d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79"}
Feb 19 19:55:04 crc kubenswrapper[4749]: I0219 19:55:04.386084 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:55:04 crc kubenswrapper[4749]: I0219 19:55:04.437763 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:55:05 crc kubenswrapper[4749]: I0219 19:55:05.334225 4749 generic.go:334] "Generic (PLEG): container finished" podID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerID="d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79" exitCode=0
Feb 19 19:55:05 crc kubenswrapper[4749]: I0219 19:55:05.334299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjw78" event={"ID":"d8c5c0b7-a76e-46e1-aa95-066286eab595","Type":"ContainerDied","Data":"d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79"}
Feb 19 19:55:05 crc kubenswrapper[4749]: I0219 19:55:05.623188 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvn5p"]
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.343700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjw78" event={"ID":"d8c5c0b7-a76e-46e1-aa95-066286eab595","Type":"ContainerStarted","Data":"3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5"}
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.343802 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mvn5p" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="registry-server" containerID="cri-o://de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d" gracePeriod=2
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.379850 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bjw78" podStartSLOduration=2.926798457 podStartE2EDuration="6.379827468s" podCreationTimestamp="2026-02-19 19:55:00 +0000 UTC" firstStartedPulling="2026-02-19 19:55:02.295708866 +0000 UTC m=+4876.256928820" lastFinishedPulling="2026-02-19 19:55:05.748737877 +0000 UTC m=+4879.709957831" observedRunningTime="2026-02-19 19:55:06.375788951 +0000 UTC m=+4880.337008905" watchObservedRunningTime="2026-02-19 19:55:06.379827468 +0000 UTC m=+4880.341047422"
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.922963 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvn5p"
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.957782 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpksr\" (UniqueName: \"kubernetes.io/projected/bffa375b-600a-4b38-a5cf-1f0573aca7d6-kube-api-access-qpksr\") pod \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") "
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.957848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-utilities\") pod \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") "
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.957915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-catalog-content\") pod \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\" (UID: \"bffa375b-600a-4b38-a5cf-1f0573aca7d6\") "
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.958807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-utilities" (OuterVolumeSpecName: "utilities") pod "bffa375b-600a-4b38-a5cf-1f0573aca7d6" (UID: "bffa375b-600a-4b38-a5cf-1f0573aca7d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:55:06 crc kubenswrapper[4749]: I0219 19:55:06.963061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bffa375b-600a-4b38-a5cf-1f0573aca7d6-kube-api-access-qpksr" (OuterVolumeSpecName: "kube-api-access-qpksr") pod "bffa375b-600a-4b38-a5cf-1f0573aca7d6" (UID: "bffa375b-600a-4b38-a5cf-1f0573aca7d6"). InnerVolumeSpecName "kube-api-access-qpksr".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.059804 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpksr\" (UniqueName: \"kubernetes.io/projected/bffa375b-600a-4b38-a5cf-1f0573aca7d6-kube-api-access-qpksr\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.059853 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.092948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bffa375b-600a-4b38-a5cf-1f0573aca7d6" (UID: "bffa375b-600a-4b38-a5cf-1f0573aca7d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.162453 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffa375b-600a-4b38-a5cf-1f0573aca7d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.354616 4749 generic.go:334] "Generic (PLEG): container finished" podID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerID="de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d" exitCode=0 Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.354656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvn5p" event={"ID":"bffa375b-600a-4b38-a5cf-1f0573aca7d6","Type":"ContainerDied","Data":"de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d"} Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.354696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-mvn5p" event={"ID":"bffa375b-600a-4b38-a5cf-1f0573aca7d6","Type":"ContainerDied","Data":"66305ef23c664b3135736c23c0875c2bca05de440bc4f9acbd75ad23530708d4"} Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.354714 4749 scope.go:117] "RemoveContainer" containerID="de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.354738 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvn5p" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.381598 4749 scope.go:117] "RemoveContainer" containerID="d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.390791 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvn5p"] Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.402510 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mvn5p"] Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.409427 4749 scope.go:117] "RemoveContainer" containerID="8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.458484 4749 scope.go:117] "RemoveContainer" containerID="de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d" Feb 19 19:55:07 crc kubenswrapper[4749]: E0219 19:55:07.458907 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d\": container with ID starting with de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d not found: ID does not exist" containerID="de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.458938 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d"} err="failed to get container status \"de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d\": rpc error: code = NotFound desc = could not find container \"de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d\": container with ID starting with de7728291d4b0323570afd917647553fa758c92ba321a4da6c4c6e207e4ca99d not found: ID does not exist" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.458960 4749 scope.go:117] "RemoveContainer" containerID="d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe" Feb 19 19:55:07 crc kubenswrapper[4749]: E0219 19:55:07.459196 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe\": container with ID starting with d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe not found: ID does not exist" containerID="d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.459225 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe"} err="failed to get container status \"d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe\": rpc error: code = NotFound desc = could not find container \"d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe\": container with ID starting with d30d709882167466a1c38904c42446137d5976ef7c65fb270290156cc4d00ebe not found: ID does not exist" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.459243 4749 scope.go:117] "RemoveContainer" containerID="8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022" Feb 19 19:55:07 crc kubenswrapper[4749]: E0219 
19:55:07.459435 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022\": container with ID starting with 8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022 not found: ID does not exist" containerID="8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022" Feb 19 19:55:07 crc kubenswrapper[4749]: I0219 19:55:07.459455 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022"} err="failed to get container status \"8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022\": rpc error: code = NotFound desc = could not find container \"8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022\": container with ID starting with 8c6f14a3687fe0c8a36a8eb97282be1263464be4a724da08776147efeb4a0022 not found: ID does not exist" Feb 19 19:55:08 crc kubenswrapper[4749]: I0219 19:55:08.689876 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" path="/var/lib/kubelet/pods/bffa375b-600a-4b38-a5cf-1f0573aca7d6/volumes" Feb 19 19:55:10 crc kubenswrapper[4749]: I0219 19:55:10.985389 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bjw78" Feb 19 19:55:10 crc kubenswrapper[4749]: I0219 19:55:10.985770 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bjw78" Feb 19 19:55:11 crc kubenswrapper[4749]: I0219 19:55:11.035959 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bjw78" Feb 19 19:55:11 crc kubenswrapper[4749]: I0219 19:55:11.442438 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-bjw78" Feb 19 19:55:12 crc kubenswrapper[4749]: I0219 19:55:12.230077 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjw78"] Feb 19 19:55:13 crc kubenswrapper[4749]: I0219 19:55:13.407159 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bjw78" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="registry-server" containerID="cri-o://3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5" gracePeriod=2 Feb 19 19:55:13 crc kubenswrapper[4749]: I0219 19:55:13.918206 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjw78" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.094307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-utilities\") pod \"d8c5c0b7-a76e-46e1-aa95-066286eab595\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.094401 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6248\" (UniqueName: \"kubernetes.io/projected/d8c5c0b7-a76e-46e1-aa95-066286eab595-kube-api-access-v6248\") pod \"d8c5c0b7-a76e-46e1-aa95-066286eab595\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.094492 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-catalog-content\") pod \"d8c5c0b7-a76e-46e1-aa95-066286eab595\" (UID: \"d8c5c0b7-a76e-46e1-aa95-066286eab595\") " Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.095793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-utilities" (OuterVolumeSpecName: "utilities") pod "d8c5c0b7-a76e-46e1-aa95-066286eab595" (UID: "d8c5c0b7-a76e-46e1-aa95-066286eab595"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.100875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c5c0b7-a76e-46e1-aa95-066286eab595-kube-api-access-v6248" (OuterVolumeSpecName: "kube-api-access-v6248") pod "d8c5c0b7-a76e-46e1-aa95-066286eab595" (UID: "d8c5c0b7-a76e-46e1-aa95-066286eab595"). InnerVolumeSpecName "kube-api-access-v6248". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.155575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8c5c0b7-a76e-46e1-aa95-066286eab595" (UID: "d8c5c0b7-a76e-46e1-aa95-066286eab595"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.197212 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.197250 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6248\" (UniqueName: \"kubernetes.io/projected/d8c5c0b7-a76e-46e1-aa95-066286eab595-kube-api-access-v6248\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.197264 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c5c0b7-a76e-46e1-aa95-066286eab595-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.419611 4749 generic.go:334] "Generic (PLEG): container finished" podID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerID="3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5" exitCode=0 Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.419660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjw78" event={"ID":"d8c5c0b7-a76e-46e1-aa95-066286eab595","Type":"ContainerDied","Data":"3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5"} Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.419693 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bjw78" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.419714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjw78" event={"ID":"d8c5c0b7-a76e-46e1-aa95-066286eab595","Type":"ContainerDied","Data":"b7edcb1c4f7e58327e0da5b29d7f1e1e2865ea92c5c70c9406bf024998b02bc3"} Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.419731 4749 scope.go:117] "RemoveContainer" containerID="3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.445458 4749 scope.go:117] "RemoveContainer" containerID="d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.468766 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjw78"] Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.489138 4749 scope.go:117] "RemoveContainer" containerID="a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.492440 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bjw78"] Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.526801 4749 scope.go:117] "RemoveContainer" containerID="3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5" Feb 19 19:55:14 crc kubenswrapper[4749]: E0219 19:55:14.528488 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5\": container with ID starting with 3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5 not found: ID does not exist" containerID="3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.528521 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5"} err="failed to get container status \"3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5\": rpc error: code = NotFound desc = could not find container \"3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5\": container with ID starting with 3e4919bc9fb4a5442da1b7e459e6808e6f83950cb3054d4793d8ff990cfb8ae5 not found: ID does not exist" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.528546 4749 scope.go:117] "RemoveContainer" containerID="d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79" Feb 19 19:55:14 crc kubenswrapper[4749]: E0219 19:55:14.528820 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79\": container with ID starting with d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79 not found: ID does not exist" containerID="d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.528893 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79"} err="failed to get container status \"d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79\": rpc error: code = NotFound desc = could not find container \"d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79\": container with ID starting with d917e2b8cd16b168ddb262fb467535b99e98d738f58c4a5d5b9cb4ffcaa78e79 not found: ID does not exist" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.528907 4749 scope.go:117] "RemoveContainer" containerID="a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93" Feb 19 19:55:14 crc kubenswrapper[4749]: E0219 
19:55:14.529271 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93\": container with ID starting with a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93 not found: ID does not exist" containerID="a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.529332 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93"} err="failed to get container status \"a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93\": rpc error: code = NotFound desc = could not find container \"a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93\": container with ID starting with a0f77dfc860fb3bacf26d50b7c36b9f6b14ad90cd848a87a75417cc9d30ede93 not found: ID does not exist" Feb 19 19:55:14 crc kubenswrapper[4749]: I0219 19:55:14.691156 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" path="/var/lib/kubelet/pods/d8c5c0b7-a76e-46e1-aa95-066286eab595/volumes" Feb 19 19:56:54 crc kubenswrapper[4749]: I0219 19:56:54.725109 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:56:54 crc kubenswrapper[4749]: I0219 19:56:54.725916 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 19:57:24 crc kubenswrapper[4749]: I0219 19:57:24.724850 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:57:24 crc kubenswrapper[4749]: I0219 19:57:24.725654 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.725279 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.725854 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.725897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.726792 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67efe7d26f397e5c6c83b5c604adacb30f5ae2a6ee4a04b1e3e91db09ca88945"} 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.726846 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://67efe7d26f397e5c6c83b5c604adacb30f5ae2a6ee4a04b1e3e91db09ca88945" gracePeriod=600 Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.858407 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="67efe7d26f397e5c6c83b5c604adacb30f5ae2a6ee4a04b1e3e91db09ca88945" exitCode=0 Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.858451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"67efe7d26f397e5c6c83b5c604adacb30f5ae2a6ee4a04b1e3e91db09ca88945"} Feb 19 19:57:54 crc kubenswrapper[4749]: I0219 19:57:54.858929 4749 scope.go:117] "RemoveContainer" containerID="4c93e4c8dd8ff51967255788f0836bd119dd48f00b026a146678e893b6554763" Feb 19 19:57:55 crc kubenswrapper[4749]: I0219 19:57:55.869034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"} Feb 19 19:58:42 crc kubenswrapper[4749]: I0219 19:58:42.304308 4749 generic.go:334] "Generic (PLEG): container finished" podID="656c9f00-c5aa-4d25-b425-84c0ce173433" containerID="4c00e383e557ddaab27532f44ffd780aa43c3e24dd11aba0455e0d56b77de22a" exitCode=0 Feb 19 19:58:42 crc kubenswrapper[4749]: I0219 19:58:42.304407 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"656c9f00-c5aa-4d25-b425-84c0ce173433","Type":"ContainerDied","Data":"4c00e383e557ddaab27532f44ffd780aa43c3e24dd11aba0455e0d56b77de22a"} Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.692785 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.789844 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-workdir\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790059 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqtf8\" (UniqueName: \"kubernetes.io/projected/656c9f00-c5aa-4d25-b425-84c0ce173433-kube-api-access-mqtf8\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-config-data\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config-secret\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ssh-key\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ca-certs\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.790251 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-temporary\") pod \"656c9f00-c5aa-4d25-b425-84c0ce173433\" (UID: \"656c9f00-c5aa-4d25-b425-84c0ce173433\") " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.791019 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: 
"656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.791471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-config-data" (OuterVolumeSpecName: "config-data") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.795858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.798048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.804596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/656c9f00-c5aa-4d25-b425-84c0ce173433-kube-api-access-mqtf8" (OuterVolumeSpecName: "kube-api-access-mqtf8") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "kube-api-access-mqtf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.821248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.821390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.826820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.846592 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "656c9f00-c5aa-4d25-b425-84c0ce173433" (UID: "656c9f00-c5aa-4d25-b425-84c0ce173433"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.892928 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.893248 4749 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.893350 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqtf8\" (UniqueName: \"kubernetes.io/projected/656c9f00-c5aa-4d25-b425-84c0ce173433-kube-api-access-mqtf8\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.893430 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.893519 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.893596 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/656c9f00-c5aa-4d25-b425-84c0ce173433-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.893663 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc 
kubenswrapper[4749]: I0219 19:58:43.893738 4749 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/656c9f00-c5aa-4d25-b425-84c0ce173433-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.893811 4749 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/656c9f00-c5aa-4d25-b425-84c0ce173433-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.924714 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 19 19:58:43 crc kubenswrapper[4749]: I0219 19:58:43.996423 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:44 crc kubenswrapper[4749]: I0219 19:58:44.328817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"656c9f00-c5aa-4d25-b425-84c0ce173433","Type":"ContainerDied","Data":"217e50d8dcb7fe0119bde170bc997d9c44e71b7d36574e0b1cdf6b3832f1cb13"} Feb 19 19:58:44 crc kubenswrapper[4749]: I0219 19:58:44.328887 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217e50d8dcb7fe0119bde170bc997d9c44e71b7d36574e0b1cdf6b3832f1cb13" Feb 19 19:58:44 crc kubenswrapper[4749]: I0219 19:58:44.328954 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.153422 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 19:58:54 crc kubenswrapper[4749]: E0219 19:58:54.154550 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656c9f00-c5aa-4d25-b425-84c0ce173433" containerName="tempest-tests-tempest-tests-runner" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154567 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="656c9f00-c5aa-4d25-b425-84c0ce173433" containerName="tempest-tests-tempest-tests-runner" Feb 19 19:58:54 crc kubenswrapper[4749]: E0219 19:58:54.154585 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="registry-server" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154593 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="registry-server" Feb 19 19:58:54 crc kubenswrapper[4749]: E0219 19:58:54.154621 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="extract-content" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154629 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="extract-content" Feb 19 19:58:54 crc kubenswrapper[4749]: E0219 19:58:54.154655 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="extract-content" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154665 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="extract-content" Feb 19 19:58:54 crc kubenswrapper[4749]: E0219 19:58:54.154683 4749 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="extract-utilities" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154691 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="extract-utilities" Feb 19 19:58:54 crc kubenswrapper[4749]: E0219 19:58:54.154709 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="registry-server" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154718 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="registry-server" Feb 19 19:58:54 crc kubenswrapper[4749]: E0219 19:58:54.154734 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="extract-utilities" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154741 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="extract-utilities" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.154973 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c5c0b7-a76e-46e1-aa95-066286eab595" containerName="registry-server" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.155000 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffa375b-600a-4b38-a5cf-1f0573aca7d6" containerName="registry-server" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.155017 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="656c9f00-c5aa-4d25-b425-84c0ce173433" containerName="tempest-tests-tempest-tests-runner" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.155914 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.165510 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.167120 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tl9hr" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.321744 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj57c\" (UniqueName: \"kubernetes.io/projected/00d5b3f4-f6de-4204-a2a8-633a9d9041e3-kube-api-access-xj57c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00d5b3f4-f6de-4204-a2a8-633a9d9041e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.322052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00d5b3f4-f6de-4204-a2a8-633a9d9041e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.424388 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj57c\" (UniqueName: \"kubernetes.io/projected/00d5b3f4-f6de-4204-a2a8-633a9d9041e3-kube-api-access-xj57c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00d5b3f4-f6de-4204-a2a8-633a9d9041e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.424576 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00d5b3f4-f6de-4204-a2a8-633a9d9041e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.425194 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00d5b3f4-f6de-4204-a2a8-633a9d9041e3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.453831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj57c\" (UniqueName: \"kubernetes.io/projected/00d5b3f4-f6de-4204-a2a8-633a9d9041e3-kube-api-access-xj57c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00d5b3f4-f6de-4204-a2a8-633a9d9041e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.457710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00d5b3f4-f6de-4204-a2a8-633a9d9041e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.482769 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.958549 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 19:58:54 crc kubenswrapper[4749]: I0219 19:58:54.966642 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:58:55 crc kubenswrapper[4749]: I0219 19:58:55.498101 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"00d5b3f4-f6de-4204-a2a8-633a9d9041e3","Type":"ContainerStarted","Data":"a586b083a5e6c066ec48eb9459190531bd8af8dcfcf61e289dbcc8779f42b424"} Feb 19 19:58:56 crc kubenswrapper[4749]: I0219 19:58:56.508264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"00d5b3f4-f6de-4204-a2a8-633a9d9041e3","Type":"ContainerStarted","Data":"ef2feea78d4b856765892f380eaed077ad60437551970f120cf1637e82c9b82b"} Feb 19 19:58:56 crc kubenswrapper[4749]: I0219 19:58:56.523832 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.4144861770000001 podStartE2EDuration="2.523817981s" podCreationTimestamp="2026-02-19 19:58:54 +0000 UTC" firstStartedPulling="2026-02-19 19:58:54.966465425 +0000 UTC m=+5108.927685379" lastFinishedPulling="2026-02-19 19:58:56.075797229 +0000 UTC m=+5110.037017183" observedRunningTime="2026-02-19 19:58:56.521652379 +0000 UTC m=+5110.482872333" watchObservedRunningTime="2026-02-19 19:58:56.523817981 +0000 UTC m=+5110.485037935" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.695670 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sfq27/must-gather-ghk8b"] Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 
19:59:21.698014 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.700646 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-sfq27"/"default-dockercfg-zwcwq" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.700696 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sfq27"/"kube-root-ca.crt" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.700750 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sfq27"/"openshift-service-ca.crt" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.707298 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sfq27/must-gather-ghk8b"] Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.842862 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-must-gather-output\") pod \"must-gather-ghk8b\" (UID: \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") " pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.843333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkdv\" (UniqueName: \"kubernetes.io/projected/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-kube-api-access-tvkdv\") pod \"must-gather-ghk8b\" (UID: \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") " pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.944973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-must-gather-output\") pod \"must-gather-ghk8b\" (UID: 
\"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") " pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.945104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkdv\" (UniqueName: \"kubernetes.io/projected/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-kube-api-access-tvkdv\") pod \"must-gather-ghk8b\" (UID: \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") " pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.945381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-must-gather-output\") pod \"must-gather-ghk8b\" (UID: \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") " pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:21 crc kubenswrapper[4749]: I0219 19:59:21.966694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkdv\" (UniqueName: \"kubernetes.io/projected/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-kube-api-access-tvkdv\") pod \"must-gather-ghk8b\" (UID: \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") " pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:22 crc kubenswrapper[4749]: I0219 19:59:22.020040 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sfq27/must-gather-ghk8b" Feb 19 19:59:22 crc kubenswrapper[4749]: I0219 19:59:22.530262 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sfq27/must-gather-ghk8b"] Feb 19 19:59:22 crc kubenswrapper[4749]: I0219 19:59:22.795108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/must-gather-ghk8b" event={"ID":"6baeaaaa-644d-4cd2-bff2-ff6c5e696204","Type":"ContainerStarted","Data":"45ecef79741f0e0b75464e3e08f19ac590b19ce3c43536a8146bda3d55dd3bbb"} Feb 19 19:59:28 crc kubenswrapper[4749]: I0219 19:59:28.860353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/must-gather-ghk8b" event={"ID":"6baeaaaa-644d-4cd2-bff2-ff6c5e696204","Type":"ContainerStarted","Data":"2eab541f23d459c8e4eafc1599a29ee6dab665bbfe1a0cc34cbdd34d4ffd60e5"} Feb 19 19:59:29 crc kubenswrapper[4749]: I0219 19:59:29.870608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/must-gather-ghk8b" event={"ID":"6baeaaaa-644d-4cd2-bff2-ff6c5e696204","Type":"ContainerStarted","Data":"09fa0555e62df6e8c411b1580ab4181288469e95a923159ad6bb4dc613f6b9a9"} Feb 19 19:59:29 crc kubenswrapper[4749]: I0219 19:59:29.885833 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sfq27/must-gather-ghk8b" podStartSLOduration=2.952699574 podStartE2EDuration="8.88581308s" podCreationTimestamp="2026-02-19 19:59:21 +0000 UTC" firstStartedPulling="2026-02-19 19:59:22.537394383 +0000 UTC m=+5136.498614337" lastFinishedPulling="2026-02-19 19:59:28.470507889 +0000 UTC m=+5142.431727843" observedRunningTime="2026-02-19 19:59:29.883403981 +0000 UTC m=+5143.844623965" watchObservedRunningTime="2026-02-19 19:59:29.88581308 +0000 UTC m=+5143.847033034" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.505831 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-sfq27/crc-debug-rpt4z"] Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.509200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.679250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n294c\" (UniqueName: \"kubernetes.io/projected/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-kube-api-access-n294c\") pod \"crc-debug-rpt4z\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.679535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-host\") pod \"crc-debug-rpt4z\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.782040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-host\") pod \"crc-debug-rpt4z\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.782515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n294c\" (UniqueName: \"kubernetes.io/projected/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-kube-api-access-n294c\") pod \"crc-debug-rpt4z\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.782171 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-host\") pod \"crc-debug-rpt4z\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.804695 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n294c\" (UniqueName: \"kubernetes.io/projected/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-kube-api-access-n294c\") pod \"crc-debug-rpt4z\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.828591 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 19:59:32 crc kubenswrapper[4749]: I0219 19:59:32.928424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" event={"ID":"d90dd387-65e1-4efa-a0b2-5d1d8178aca4","Type":"ContainerStarted","Data":"3f2c43d4840221c57f22d4a8b87cdb9fc55ba8865113a0a859fdb03f771d7f20"} Feb 19 19:59:44 crc kubenswrapper[4749]: I0219 19:59:44.028607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" event={"ID":"d90dd387-65e1-4efa-a0b2-5d1d8178aca4","Type":"ContainerStarted","Data":"ee6efd9227a60bdb65af71d3d6a049beea9d0a0df051d733be5335e9d7e86e69"} Feb 19 19:59:44 crc kubenswrapper[4749]: I0219 19:59:44.044691 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" podStartSLOduration=1.7389038110000001 podStartE2EDuration="12.044673393s" podCreationTimestamp="2026-02-19 19:59:32 +0000 UTC" firstStartedPulling="2026-02-19 19:59:32.877958541 +0000 UTC m=+5146.839178495" lastFinishedPulling="2026-02-19 19:59:43.183728123 +0000 UTC m=+5157.144948077" observedRunningTime="2026-02-19 19:59:44.042212934 +0000 UTC m=+5158.003432888" 
watchObservedRunningTime="2026-02-19 19:59:44.044673393 +0000 UTC m=+5158.005893347" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.145852 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp"] Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.148021 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.150478 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.157465 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.159089 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp"] Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.305733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa47884-562c-4388-901c-4c63d9560b35-config-volume\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.305845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa47884-562c-4388-901c-4c63d9560b35-secret-volume\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc 
kubenswrapper[4749]: I0219 20:00:00.305889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwqk\" (UniqueName: \"kubernetes.io/projected/5fa47884-562c-4388-901c-4c63d9560b35-kube-api-access-4bwqk\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.408180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa47884-562c-4388-901c-4c63d9560b35-config-volume\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.408295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa47884-562c-4388-901c-4c63d9560b35-secret-volume\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.408322 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwqk\" (UniqueName: \"kubernetes.io/projected/5fa47884-562c-4388-901c-4c63d9560b35-kube-api-access-4bwqk\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.408945 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa47884-562c-4388-901c-4c63d9560b35-config-volume\") pod \"collect-profiles-29525520-d4nxp\" 
(UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.426287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa47884-562c-4388-901c-4c63d9560b35-secret-volume\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.429327 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwqk\" (UniqueName: \"kubernetes.io/projected/5fa47884-562c-4388-901c-4c63d9560b35-kube-api-access-4bwqk\") pod \"collect-profiles-29525520-d4nxp\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.472239 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:00 crc kubenswrapper[4749]: W0219 20:00:00.968370 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa47884_562c_4388_901c_4c63d9560b35.slice/crio-858aa3aab5ff142b6d427e4adfc4e8242afb85fb4313b244041aeb4709130df6 WatchSource:0}: Error finding container 858aa3aab5ff142b6d427e4adfc4e8242afb85fb4313b244041aeb4709130df6: Status 404 returned error can't find the container with id 858aa3aab5ff142b6d427e4adfc4e8242afb85fb4313b244041aeb4709130df6 Feb 19 20:00:00 crc kubenswrapper[4749]: I0219 20:00:00.980860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp"] Feb 19 20:00:01 crc kubenswrapper[4749]: I0219 20:00:01.185296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" event={"ID":"5fa47884-562c-4388-901c-4c63d9560b35","Type":"ContainerStarted","Data":"858aa3aab5ff142b6d427e4adfc4e8242afb85fb4313b244041aeb4709130df6"} Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.217900 4749 generic.go:334] "Generic (PLEG): container finished" podID="5fa47884-562c-4388-901c-4c63d9560b35" containerID="af52c0eddafca4e6298cd2833f90f8c8b908f590afe9117dc8d8e4c0a4e8a2e9" exitCode=0 Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.217976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" event={"ID":"5fa47884-562c-4388-901c-4c63d9560b35","Type":"ContainerDied","Data":"af52c0eddafca4e6298cd2833f90f8c8b908f590afe9117dc8d8e4c0a4e8a2e9"} Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.307361 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5bks"] Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 
20:00:02.309639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.321043 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5bks"] Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.452734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59v7m\" (UniqueName: \"kubernetes.io/projected/5122c576-5ac2-42a4-a117-cb9f98cd48a3-kube-api-access-59v7m\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.452959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-catalog-content\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.453003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-utilities\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.555617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-catalog-content\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: 
I0219 20:00:02.555668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-utilities\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.555964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59v7m\" (UniqueName: \"kubernetes.io/projected/5122c576-5ac2-42a4-a117-cb9f98cd48a3-kube-api-access-59v7m\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.556858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-catalog-content\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.557138 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-utilities\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.580273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59v7m\" (UniqueName: \"kubernetes.io/projected/5122c576-5ac2-42a4-a117-cb9f98cd48a3-kube-api-access-59v7m\") pod \"redhat-marketplace-m5bks\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:02 crc kubenswrapper[4749]: I0219 20:00:02.646165 4749 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.218732 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5bks"] Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.238827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5bks" event={"ID":"5122c576-5ac2-42a4-a117-cb9f98cd48a3","Type":"ContainerStarted","Data":"9abd0a6d5ddb12e438714f2ae267e777ad0a51b4e7a7b364d4eb3e958100336b"} Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.540012 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.700082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa47884-562c-4388-901c-4c63d9560b35-secret-volume\") pod \"5fa47884-562c-4388-901c-4c63d9560b35\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.700224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bwqk\" (UniqueName: \"kubernetes.io/projected/5fa47884-562c-4388-901c-4c63d9560b35-kube-api-access-4bwqk\") pod \"5fa47884-562c-4388-901c-4c63d9560b35\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.700270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa47884-562c-4388-901c-4c63d9560b35-config-volume\") pod \"5fa47884-562c-4388-901c-4c63d9560b35\" (UID: \"5fa47884-562c-4388-901c-4c63d9560b35\") " Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.701496 4749 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fa47884-562c-4388-901c-4c63d9560b35-config-volume" (OuterVolumeSpecName: "config-volume") pod "5fa47884-562c-4388-901c-4c63d9560b35" (UID: "5fa47884-562c-4388-901c-4c63d9560b35"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.713216 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa47884-562c-4388-901c-4c63d9560b35-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5fa47884-562c-4388-901c-4c63d9560b35" (UID: "5fa47884-562c-4388-901c-4c63d9560b35"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.721666 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa47884-562c-4388-901c-4c63d9560b35-kube-api-access-4bwqk" (OuterVolumeSpecName: "kube-api-access-4bwqk") pod "5fa47884-562c-4388-901c-4c63d9560b35" (UID: "5fa47884-562c-4388-901c-4c63d9560b35"). InnerVolumeSpecName "kube-api-access-4bwqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.803906 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa47884-562c-4388-901c-4c63d9560b35-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.803949 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bwqk\" (UniqueName: \"kubernetes.io/projected/5fa47884-562c-4388-901c-4c63d9560b35-kube-api-access-4bwqk\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4749]: I0219 20:00:03.803964 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa47884-562c-4388-901c-4c63d9560b35-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.249293 4749 generic.go:334] "Generic (PLEG): container finished" podID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerID="bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4" exitCode=0 Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.249378 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5bks" event={"ID":"5122c576-5ac2-42a4-a117-cb9f98cd48a3","Type":"ContainerDied","Data":"bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4"} Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.251896 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" event={"ID":"5fa47884-562c-4388-901c-4c63d9560b35","Type":"ContainerDied","Data":"858aa3aab5ff142b6d427e4adfc4e8242afb85fb4313b244041aeb4709130df6"} Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.251920 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="858aa3aab5ff142b6d427e4adfc4e8242afb85fb4313b244041aeb4709130df6" Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.251964 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-d4nxp" Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.620576 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"] Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.630820 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-jbrdb"] Feb 19 20:00:04 crc kubenswrapper[4749]: I0219 20:00:04.692159 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7ee6e5-4a4d-49d3-83f1-59192e92daba" path="/var/lib/kubelet/pods/9a7ee6e5-4a4d-49d3-83f1-59192e92daba/volumes" Feb 19 20:00:06 crc kubenswrapper[4749]: I0219 20:00:06.271260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5bks" event={"ID":"5122c576-5ac2-42a4-a117-cb9f98cd48a3","Type":"ContainerStarted","Data":"5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9"} Feb 19 20:00:08 crc kubenswrapper[4749]: I0219 20:00:08.306587 4749 generic.go:334] "Generic (PLEG): container finished" podID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerID="5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9" exitCode=0 Feb 19 20:00:08 crc kubenswrapper[4749]: I0219 20:00:08.306624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5bks" event={"ID":"5122c576-5ac2-42a4-a117-cb9f98cd48a3","Type":"ContainerDied","Data":"5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9"} Feb 19 20:00:10 crc kubenswrapper[4749]: I0219 20:00:10.325729 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-m5bks" event={"ID":"5122c576-5ac2-42a4-a117-cb9f98cd48a3","Type":"ContainerStarted","Data":"c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0"} Feb 19 20:00:10 crc kubenswrapper[4749]: I0219 20:00:10.345335 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5bks" podStartSLOduration=2.854185886 podStartE2EDuration="8.345315578s" podCreationTimestamp="2026-02-19 20:00:02 +0000 UTC" firstStartedPulling="2026-02-19 20:00:04.251412758 +0000 UTC m=+5178.212632712" lastFinishedPulling="2026-02-19 20:00:09.74254245 +0000 UTC m=+5183.703762404" observedRunningTime="2026-02-19 20:00:10.342288214 +0000 UTC m=+5184.303508188" watchObservedRunningTime="2026-02-19 20:00:10.345315578 +0000 UTC m=+5184.306535532" Feb 19 20:00:12 crc kubenswrapper[4749]: I0219 20:00:12.646860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:12 crc kubenswrapper[4749]: I0219 20:00:12.647257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:12 crc kubenswrapper[4749]: I0219 20:00:12.707363 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:22 crc kubenswrapper[4749]: I0219 20:00:22.702407 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:22 crc kubenswrapper[4749]: I0219 20:00:22.763106 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5bks"] Feb 19 20:00:23 crc kubenswrapper[4749]: I0219 20:00:23.441481 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m5bks" 
podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="registry-server" containerID="cri-o://c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0" gracePeriod=2 Feb 19 20:00:23 crc kubenswrapper[4749]: I0219 20:00:23.905287 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.018671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-utilities\") pod \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.018850 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-catalog-content\") pod \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.018889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59v7m\" (UniqueName: \"kubernetes.io/projected/5122c576-5ac2-42a4-a117-cb9f98cd48a3-kube-api-access-59v7m\") pod \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\" (UID: \"5122c576-5ac2-42a4-a117-cb9f98cd48a3\") " Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.019683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-utilities" (OuterVolumeSpecName: "utilities") pod "5122c576-5ac2-42a4-a117-cb9f98cd48a3" (UID: "5122c576-5ac2-42a4-a117-cb9f98cd48a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.027809 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5122c576-5ac2-42a4-a117-cb9f98cd48a3-kube-api-access-59v7m" (OuterVolumeSpecName: "kube-api-access-59v7m") pod "5122c576-5ac2-42a4-a117-cb9f98cd48a3" (UID: "5122c576-5ac2-42a4-a117-cb9f98cd48a3"). InnerVolumeSpecName "kube-api-access-59v7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.057111 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5122c576-5ac2-42a4-a117-cb9f98cd48a3" (UID: "5122c576-5ac2-42a4-a117-cb9f98cd48a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.121705 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.121762 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59v7m\" (UniqueName: \"kubernetes.io/projected/5122c576-5ac2-42a4-a117-cb9f98cd48a3-kube-api-access-59v7m\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.121777 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122c576-5ac2-42a4-a117-cb9f98cd48a3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.451937 4749 generic.go:334] "Generic (PLEG): container finished" podID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" 
containerID="c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0" exitCode=0 Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.452252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5bks" event={"ID":"5122c576-5ac2-42a4-a117-cb9f98cd48a3","Type":"ContainerDied","Data":"c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0"} Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.452387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5bks" event={"ID":"5122c576-5ac2-42a4-a117-cb9f98cd48a3","Type":"ContainerDied","Data":"9abd0a6d5ddb12e438714f2ae267e777ad0a51b4e7a7b364d4eb3e958100336b"} Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.452483 4749 scope.go:117] "RemoveContainer" containerID="c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.452716 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5bks" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.589711 4749 scope.go:117] "RemoveContainer" containerID="5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.611358 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5bks"] Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.619055 4749 scope.go:117] "RemoveContainer" containerID="bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.628067 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5bks"] Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.665992 4749 scope.go:117] "RemoveContainer" containerID="c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0" Feb 19 20:00:24 crc kubenswrapper[4749]: E0219 20:00:24.666580 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0\": container with ID starting with c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0 not found: ID does not exist" containerID="c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.666623 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0"} err="failed to get container status \"c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0\": rpc error: code = NotFound desc = could not find container \"c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0\": container with ID starting with c49943eba091345092fbd3ddcac424c36229146f3c5469976075ec1e3aafcac0 not found: 
ID does not exist" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.666652 4749 scope.go:117] "RemoveContainer" containerID="5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9" Feb 19 20:00:24 crc kubenswrapper[4749]: E0219 20:00:24.667059 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9\": container with ID starting with 5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9 not found: ID does not exist" containerID="5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.667085 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9"} err="failed to get container status \"5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9\": rpc error: code = NotFound desc = could not find container \"5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9\": container with ID starting with 5abf34ea7b5cf3b263f5342e322f0f67a89b4ac11ff09a41d1bee4f9267218e9 not found: ID does not exist" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.667106 4749 scope.go:117] "RemoveContainer" containerID="bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4" Feb 19 20:00:24 crc kubenswrapper[4749]: E0219 20:00:24.667369 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4\": container with ID starting with bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4 not found: ID does not exist" containerID="bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.667395 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4"} err="failed to get container status \"bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4\": rpc error: code = NotFound desc = could not find container \"bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4\": container with ID starting with bce65422575f09adbc03a5c60998cd38200652bf338fe893c95573a885a283f4 not found: ID does not exist" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.691959 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" path="/var/lib/kubelet/pods/5122c576-5ac2-42a4-a117-cb9f98cd48a3/volumes" Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.725400 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:00:24 crc kubenswrapper[4749]: I0219 20:00:24.725457 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:00:26 crc kubenswrapper[4749]: I0219 20:00:26.147896 4749 scope.go:117] "RemoveContainer" containerID="f0bf23d1584e1a3ec050cc5c328a9e55559dd4f6d25aa109e51fc657d81f3484" Feb 19 20:00:31 crc kubenswrapper[4749]: I0219 20:00:31.522633 4749 generic.go:334] "Generic (PLEG): container finished" podID="d90dd387-65e1-4efa-a0b2-5d1d8178aca4" containerID="ee6efd9227a60bdb65af71d3d6a049beea9d0a0df051d733be5335e9d7e86e69" exitCode=0 Feb 19 20:00:31 crc kubenswrapper[4749]: I0219 20:00:31.522719 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" event={"ID":"d90dd387-65e1-4efa-a0b2-5d1d8178aca4","Type":"ContainerDied","Data":"ee6efd9227a60bdb65af71d3d6a049beea9d0a0df051d733be5335e9d7e86e69"} Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.646305 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.692569 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sfq27/crc-debug-rpt4z"] Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.693790 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sfq27/crc-debug-rpt4z"] Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.808216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n294c\" (UniqueName: \"kubernetes.io/projected/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-kube-api-access-n294c\") pod \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.808783 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-host\") pod \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\" (UID: \"d90dd387-65e1-4efa-a0b2-5d1d8178aca4\") " Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.808915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-host" (OuterVolumeSpecName: "host") pod "d90dd387-65e1-4efa-a0b2-5d1d8178aca4" (UID: "d90dd387-65e1-4efa-a0b2-5d1d8178aca4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.809384 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.817451 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-kube-api-access-n294c" (OuterVolumeSpecName: "kube-api-access-n294c") pod "d90dd387-65e1-4efa-a0b2-5d1d8178aca4" (UID: "d90dd387-65e1-4efa-a0b2-5d1d8178aca4"). InnerVolumeSpecName "kube-api-access-n294c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:32 crc kubenswrapper[4749]: I0219 20:00:32.911824 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n294c\" (UniqueName: \"kubernetes.io/projected/d90dd387-65e1-4efa-a0b2-5d1d8178aca4-kube-api-access-n294c\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.540578 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-rpt4z" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.540621 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f2c43d4840221c57f22d4a8b87cdb9fc55ba8865113a0a859fdb03f771d7f20" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.852313 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sfq27/crc-debug-gfjnx"] Feb 19 20:00:33 crc kubenswrapper[4749]: E0219 20:00:33.852692 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa47884-562c-4388-901c-4c63d9560b35" containerName="collect-profiles" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.852704 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa47884-562c-4388-901c-4c63d9560b35" containerName="collect-profiles" Feb 19 20:00:33 crc kubenswrapper[4749]: E0219 20:00:33.852720 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="extract-utilities" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.852727 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="extract-utilities" Feb 19 20:00:33 crc kubenswrapper[4749]: E0219 20:00:33.852744 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="registry-server" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.852751 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="registry-server" Feb 19 20:00:33 crc kubenswrapper[4749]: E0219 20:00:33.852774 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="extract-content" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.852781 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="extract-content" Feb 19 20:00:33 crc kubenswrapper[4749]: E0219 20:00:33.852793 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90dd387-65e1-4efa-a0b2-5d1d8178aca4" containerName="container-00" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.852800 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90dd387-65e1-4efa-a0b2-5d1d8178aca4" containerName="container-00" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.852980 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa47884-562c-4388-901c-4c63d9560b35" containerName="collect-profiles" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.853004 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90dd387-65e1-4efa-a0b2-5d1d8178aca4" containerName="container-00" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.853016 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5122c576-5ac2-42a4-a117-cb9f98cd48a3" containerName="registry-server" Feb 19 20:00:33 crc kubenswrapper[4749]: I0219 20:00:33.853762 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.034571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4qb\" (UniqueName: \"kubernetes.io/projected/5c2bd445-2635-46db-a0fb-304689071b20-kube-api-access-6n4qb\") pod \"crc-debug-gfjnx\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.034665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c2bd445-2635-46db-a0fb-304689071b20-host\") pod \"crc-debug-gfjnx\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.136543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4qb\" (UniqueName: \"kubernetes.io/projected/5c2bd445-2635-46db-a0fb-304689071b20-kube-api-access-6n4qb\") pod \"crc-debug-gfjnx\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.136985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c2bd445-2635-46db-a0fb-304689071b20-host\") pod \"crc-debug-gfjnx\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.137087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c2bd445-2635-46db-a0fb-304689071b20-host\") pod \"crc-debug-gfjnx\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc 
kubenswrapper[4749]: I0219 20:00:34.159619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4qb\" (UniqueName: \"kubernetes.io/projected/5c2bd445-2635-46db-a0fb-304689071b20-kube-api-access-6n4qb\") pod \"crc-debug-gfjnx\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.170386 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.550505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" event={"ID":"5c2bd445-2635-46db-a0fb-304689071b20","Type":"ContainerStarted","Data":"01e4afbec47b99516ac12f1173a0d5a786ccf408a679c2e879582c2a68992a76"} Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.550824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" event={"ID":"5c2bd445-2635-46db-a0fb-304689071b20","Type":"ContainerStarted","Data":"50b684d4e8d044a6f94c584bc79e0a13a7c6bcd39613dead00aa99856acc922e"} Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.566802 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" podStartSLOduration=1.5667873079999999 podStartE2EDuration="1.566787308s" podCreationTimestamp="2026-02-19 20:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:00:34.563477678 +0000 UTC m=+5208.524697632" watchObservedRunningTime="2026-02-19 20:00:34.566787308 +0000 UTC m=+5208.528007262" Feb 19 20:00:34 crc kubenswrapper[4749]: I0219 20:00:34.691074 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90dd387-65e1-4efa-a0b2-5d1d8178aca4" 
path="/var/lib/kubelet/pods/d90dd387-65e1-4efa-a0b2-5d1d8178aca4/volumes" Feb 19 20:00:35 crc kubenswrapper[4749]: I0219 20:00:35.560601 4749 generic.go:334] "Generic (PLEG): container finished" podID="5c2bd445-2635-46db-a0fb-304689071b20" containerID="01e4afbec47b99516ac12f1173a0d5a786ccf408a679c2e879582c2a68992a76" exitCode=0 Feb 19 20:00:35 crc kubenswrapper[4749]: I0219 20:00:35.560664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" event={"ID":"5c2bd445-2635-46db-a0fb-304689071b20","Type":"ContainerDied","Data":"01e4afbec47b99516ac12f1173a0d5a786ccf408a679c2e879582c2a68992a76"} Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.696181 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.779730 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c2bd445-2635-46db-a0fb-304689071b20-host\") pod \"5c2bd445-2635-46db-a0fb-304689071b20\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.779857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n4qb\" (UniqueName: \"kubernetes.io/projected/5c2bd445-2635-46db-a0fb-304689071b20-kube-api-access-6n4qb\") pod \"5c2bd445-2635-46db-a0fb-304689071b20\" (UID: \"5c2bd445-2635-46db-a0fb-304689071b20\") " Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.780845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2bd445-2635-46db-a0fb-304689071b20-host" (OuterVolumeSpecName: "host") pod "5c2bd445-2635-46db-a0fb-304689071b20" (UID: "5c2bd445-2635-46db-a0fb-304689071b20"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.786863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2bd445-2635-46db-a0fb-304689071b20-kube-api-access-6n4qb" (OuterVolumeSpecName: "kube-api-access-6n4qb") pod "5c2bd445-2635-46db-a0fb-304689071b20" (UID: "5c2bd445-2635-46db-a0fb-304689071b20"). InnerVolumeSpecName "kube-api-access-6n4qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.820507 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sfq27/crc-debug-gfjnx"] Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.830651 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sfq27/crc-debug-gfjnx"] Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.882801 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c2bd445-2635-46db-a0fb-304689071b20-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:36 crc kubenswrapper[4749]: I0219 20:00:36.882835 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n4qb\" (UniqueName: \"kubernetes.io/projected/5c2bd445-2635-46db-a0fb-304689071b20-kube-api-access-6n4qb\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:37 crc kubenswrapper[4749]: I0219 20:00:37.578752 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50b684d4e8d044a6f94c584bc79e0a13a7c6bcd39613dead00aa99856acc922e" Feb 19 20:00:37 crc kubenswrapper[4749]: I0219 20:00:37.578792 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-gfjnx" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.001850 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sfq27/crc-debug-wfq66"] Feb 19 20:00:38 crc kubenswrapper[4749]: E0219 20:00:38.002294 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2bd445-2635-46db-a0fb-304689071b20" containerName="container-00" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.002307 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2bd445-2635-46db-a0fb-304689071b20" containerName="container-00" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.002493 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2bd445-2635-46db-a0fb-304689071b20" containerName="container-00" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.003200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.106905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa69ffc1-303c-4ae2-b012-34440282b0fb-host\") pod \"crc-debug-wfq66\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.106969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d76q6\" (UniqueName: \"kubernetes.io/projected/aa69ffc1-303c-4ae2-b012-34440282b0fb-kube-api-access-d76q6\") pod \"crc-debug-wfq66\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.208333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d76q6\" (UniqueName: 
\"kubernetes.io/projected/aa69ffc1-303c-4ae2-b012-34440282b0fb-kube-api-access-d76q6\") pod \"crc-debug-wfq66\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.208530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa69ffc1-303c-4ae2-b012-34440282b0fb-host\") pod \"crc-debug-wfq66\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.208630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa69ffc1-303c-4ae2-b012-34440282b0fb-host\") pod \"crc-debug-wfq66\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.234687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d76q6\" (UniqueName: \"kubernetes.io/projected/aa69ffc1-303c-4ae2-b012-34440282b0fb-kube-api-access-d76q6\") pod \"crc-debug-wfq66\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.321803 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:38 crc kubenswrapper[4749]: W0219 20:00:38.354451 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa69ffc1_303c_4ae2_b012_34440282b0fb.slice/crio-e5d8831644cefc7bae993a784047d1c331b6e248fbe6c363a90081b102ed45fd WatchSource:0}: Error finding container e5d8831644cefc7bae993a784047d1c331b6e248fbe6c363a90081b102ed45fd: Status 404 returned error can't find the container with id e5d8831644cefc7bae993a784047d1c331b6e248fbe6c363a90081b102ed45fd Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.589889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-wfq66" event={"ID":"aa69ffc1-303c-4ae2-b012-34440282b0fb","Type":"ContainerStarted","Data":"e5d8831644cefc7bae993a784047d1c331b6e248fbe6c363a90081b102ed45fd"} Feb 19 20:00:38 crc kubenswrapper[4749]: I0219 20:00:38.689640 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2bd445-2635-46db-a0fb-304689071b20" path="/var/lib/kubelet/pods/5c2bd445-2635-46db-a0fb-304689071b20/volumes" Feb 19 20:00:39 crc kubenswrapper[4749]: I0219 20:00:39.600103 4749 generic.go:334] "Generic (PLEG): container finished" podID="aa69ffc1-303c-4ae2-b012-34440282b0fb" containerID="0ea42cd3709568a1abeb3f697901e7587bf541776035caf7a33cc0e2002ed261" exitCode=0 Feb 19 20:00:39 crc kubenswrapper[4749]: I0219 20:00:39.600179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/crc-debug-wfq66" event={"ID":"aa69ffc1-303c-4ae2-b012-34440282b0fb","Type":"ContainerDied","Data":"0ea42cd3709568a1abeb3f697901e7587bf541776035caf7a33cc0e2002ed261"} Feb 19 20:00:39 crc kubenswrapper[4749]: I0219 20:00:39.648185 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sfq27/crc-debug-wfq66"] Feb 19 20:00:39 crc kubenswrapper[4749]: I0219 20:00:39.660445 4749 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sfq27/crc-debug-wfq66"] Feb 19 20:00:40 crc kubenswrapper[4749]: I0219 20:00:40.725558 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:40 crc kubenswrapper[4749]: I0219 20:00:40.865330 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa69ffc1-303c-4ae2-b012-34440282b0fb-host\") pod \"aa69ffc1-303c-4ae2-b012-34440282b0fb\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " Feb 19 20:00:40 crc kubenswrapper[4749]: I0219 20:00:40.865496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d76q6\" (UniqueName: \"kubernetes.io/projected/aa69ffc1-303c-4ae2-b012-34440282b0fb-kube-api-access-d76q6\") pod \"aa69ffc1-303c-4ae2-b012-34440282b0fb\" (UID: \"aa69ffc1-303c-4ae2-b012-34440282b0fb\") " Feb 19 20:00:40 crc kubenswrapper[4749]: I0219 20:00:40.865581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa69ffc1-303c-4ae2-b012-34440282b0fb-host" (OuterVolumeSpecName: "host") pod "aa69ffc1-303c-4ae2-b012-34440282b0fb" (UID: "aa69ffc1-303c-4ae2-b012-34440282b0fb"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:00:40 crc kubenswrapper[4749]: I0219 20:00:40.866175 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa69ffc1-303c-4ae2-b012-34440282b0fb-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:40 crc kubenswrapper[4749]: I0219 20:00:40.871681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa69ffc1-303c-4ae2-b012-34440282b0fb-kube-api-access-d76q6" (OuterVolumeSpecName: "kube-api-access-d76q6") pod "aa69ffc1-303c-4ae2-b012-34440282b0fb" (UID: "aa69ffc1-303c-4ae2-b012-34440282b0fb"). InnerVolumeSpecName "kube-api-access-d76q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:40 crc kubenswrapper[4749]: I0219 20:00:40.968330 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d76q6\" (UniqueName: \"kubernetes.io/projected/aa69ffc1-303c-4ae2-b012-34440282b0fb-kube-api-access-d76q6\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:41 crc kubenswrapper[4749]: I0219 20:00:41.618290 4749 scope.go:117] "RemoveContainer" containerID="0ea42cd3709568a1abeb3f697901e7587bf541776035caf7a33cc0e2002ed261" Feb 19 20:00:41 crc kubenswrapper[4749]: I0219 20:00:41.618338 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sfq27/crc-debug-wfq66" Feb 19 20:00:42 crc kubenswrapper[4749]: I0219 20:00:42.694268 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa69ffc1-303c-4ae2-b012-34440282b0fb" path="/var/lib/kubelet/pods/aa69ffc1-303c-4ae2-b012-34440282b0fb/volumes" Feb 19 20:00:54 crc kubenswrapper[4749]: I0219 20:00:54.725044 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:00:54 crc kubenswrapper[4749]: I0219 20:00:54.725585 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.165304 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525521-n6rn5"] Feb 19 20:01:00 crc kubenswrapper[4749]: E0219 20:01:00.167123 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa69ffc1-303c-4ae2-b012-34440282b0fb" containerName="container-00" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.167208 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa69ffc1-303c-4ae2-b012-34440282b0fb" containerName="container-00" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.167506 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa69ffc1-303c-4ae2-b012-34440282b0fb" containerName="container-00" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.169232 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.182451 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525521-n6rn5"] Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.209239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-combined-ca-bundle\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.209411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl29c\" (UniqueName: \"kubernetes.io/projected/9759f20d-bfd3-4538-b451-418ffaa00853-kube-api-access-sl29c\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.209596 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-fernet-keys\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.209652 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-config-data\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.310832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-combined-ca-bundle\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.310910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl29c\" (UniqueName: \"kubernetes.io/projected/9759f20d-bfd3-4538-b451-418ffaa00853-kube-api-access-sl29c\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.311008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-fernet-keys\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.311068 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-config-data\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.536995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-config-data\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.537055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-fernet-keys\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.537238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl29c\" (UniqueName: \"kubernetes.io/projected/9759f20d-bfd3-4538-b451-418ffaa00853-kube-api-access-sl29c\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.537239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-combined-ca-bundle\") pod \"keystone-cron-29525521-n6rn5\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:00 crc kubenswrapper[4749]: I0219 20:01:00.798917 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:01 crc kubenswrapper[4749]: I0219 20:01:01.289238 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525521-n6rn5"] Feb 19 20:01:01 crc kubenswrapper[4749]: I0219 20:01:01.847620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-n6rn5" event={"ID":"9759f20d-bfd3-4538-b451-418ffaa00853","Type":"ContainerStarted","Data":"30f36ead3af091933807571a7b79e3b82cb3f7734451ac97be499ef5c1b3ac19"} Feb 19 20:01:01 crc kubenswrapper[4749]: I0219 20:01:01.848039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-n6rn5" event={"ID":"9759f20d-bfd3-4538-b451-418ffaa00853","Type":"ContainerStarted","Data":"0e3fa315fdc1e0edb6ed1c375e79cde96417de92b054ce2d0e0115cafbf07fce"} Feb 19 20:01:01 crc kubenswrapper[4749]: I0219 20:01:01.866623 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525521-n6rn5" podStartSLOduration=1.866599846 podStartE2EDuration="1.866599846s" podCreationTimestamp="2026-02-19 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:01:01.86510783 +0000 UTC m=+5235.826327794" watchObservedRunningTime="2026-02-19 20:01:01.866599846 +0000 UTC m=+5235.827819800" Feb 19 20:01:05 crc kubenswrapper[4749]: I0219 20:01:05.879890 4749 generic.go:334] "Generic (PLEG): container finished" podID="9759f20d-bfd3-4538-b451-418ffaa00853" containerID="30f36ead3af091933807571a7b79e3b82cb3f7734451ac97be499ef5c1b3ac19" exitCode=0 Feb 19 20:01:05 crc kubenswrapper[4749]: I0219 20:01:05.879974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-n6rn5" 
event={"ID":"9759f20d-bfd3-4538-b451-418ffaa00853","Type":"ContainerDied","Data":"30f36ead3af091933807571a7b79e3b82cb3f7734451ac97be499ef5c1b3ac19"} Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.232064 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.275570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-fernet-keys\") pod \"9759f20d-bfd3-4538-b451-418ffaa00853\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.275634 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-config-data\") pod \"9759f20d-bfd3-4538-b451-418ffaa00853\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.275682 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-combined-ca-bundle\") pod \"9759f20d-bfd3-4538-b451-418ffaa00853\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.275823 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl29c\" (UniqueName: \"kubernetes.io/projected/9759f20d-bfd3-4538-b451-418ffaa00853-kube-api-access-sl29c\") pod \"9759f20d-bfd3-4538-b451-418ffaa00853\" (UID: \"9759f20d-bfd3-4538-b451-418ffaa00853\") " Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.286249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9759f20d-bfd3-4538-b451-418ffaa00853-kube-api-access-sl29c" 
(OuterVolumeSpecName: "kube-api-access-sl29c") pod "9759f20d-bfd3-4538-b451-418ffaa00853" (UID: "9759f20d-bfd3-4538-b451-418ffaa00853"). InnerVolumeSpecName "kube-api-access-sl29c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.286879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9759f20d-bfd3-4538-b451-418ffaa00853" (UID: "9759f20d-bfd3-4538-b451-418ffaa00853"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.320589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9759f20d-bfd3-4538-b451-418ffaa00853" (UID: "9759f20d-bfd3-4538-b451-418ffaa00853"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.337558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-config-data" (OuterVolumeSpecName: "config-data") pod "9759f20d-bfd3-4538-b451-418ffaa00853" (UID: "9759f20d-bfd3-4538-b451-418ffaa00853"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.377501 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.377530 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.377539 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9759f20d-bfd3-4538-b451-418ffaa00853-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.377551 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl29c\" (UniqueName: \"kubernetes.io/projected/9759f20d-bfd3-4538-b451-418ffaa00853-kube-api-access-sl29c\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.911904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-n6rn5" event={"ID":"9759f20d-bfd3-4538-b451-418ffaa00853","Type":"ContainerDied","Data":"0e3fa315fdc1e0edb6ed1c375e79cde96417de92b054ce2d0e0115cafbf07fce"} Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.912238 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e3fa315fdc1e0edb6ed1c375e79cde96417de92b054ce2d0e0115cafbf07fce" Feb 19 20:01:07 crc kubenswrapper[4749]: I0219 20:01:07.912388 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-n6rn5" Feb 19 20:01:21 crc kubenswrapper[4749]: I0219 20:01:21.237896 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d97b8448-pl5l5_f19ae41b-3804-434e-b4a6-d461167f9548/barbican-api/0.log" Feb 19 20:01:21 crc kubenswrapper[4749]: I0219 20:01:21.470646 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d97b8448-pl5l5_f19ae41b-3804-434e-b4a6-d461167f9548/barbican-api-log/0.log" Feb 19 20:01:21 crc kubenswrapper[4749]: I0219 20:01:21.517339 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d44554668-c49q8_2ea2def9-7751-439c-8c18-05f3568cae9f/barbican-keystone-listener/0.log" Feb 19 20:01:21 crc kubenswrapper[4749]: I0219 20:01:21.633483 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d44554668-c49q8_2ea2def9-7751-439c-8c18-05f3568cae9f/barbican-keystone-listener-log/0.log" Feb 19 20:01:21 crc kubenswrapper[4749]: I0219 20:01:21.761610 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b845bddc9-bzwtz_16aa5e20-01b7-401e-abfd-161e81af9c70/barbican-worker-log/0.log" Feb 19 20:01:21 crc kubenswrapper[4749]: I0219 20:01:21.773450 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b845bddc9-bzwtz_16aa5e20-01b7-401e-abfd-161e81af9c70/barbican-worker/0.log" Feb 19 20:01:21 crc kubenswrapper[4749]: I0219 20:01:21.949796 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq_22b6e4d8-5ab3-4a92-b8a4-e38b68e59744/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.108045 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/ceilometer-central-agent/0.log" Feb 19 
20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.145246 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/ceilometer-notification-agent/0.log" Feb 19 20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.200982 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/sg-core/0.log" Feb 19 20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.226608 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/proxy-httpd/0.log" Feb 19 20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.491275 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fa8559a3-5137-4d82-a189-18e060db5fa5/cinder-api-log/0.log" Feb 19 20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.818384 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_94d055c5-b069-494b-a250-27a5c39826c2/probe/0.log" Feb 19 20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.926958 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fa8559a3-5137-4d82-a189-18e060db5fa5/cinder-api/0.log" Feb 19 20:01:22 crc kubenswrapper[4749]: I0219 20:01:22.941830 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_94d055c5-b069-494b-a250-27a5c39826c2/cinder-backup/0.log" Feb 19 20:01:23 crc kubenswrapper[4749]: I0219 20:01:23.045565 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f19d3222-dbed-44bf-94e0-7a17f5906051/cinder-scheduler/0.log" Feb 19 20:01:23 crc kubenswrapper[4749]: I0219 20:01:23.119559 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f19d3222-dbed-44bf-94e0-7a17f5906051/probe/0.log" Feb 19 20:01:23 crc kubenswrapper[4749]: I0219 20:01:23.321785 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-nfs-0_afe70cf9-b491-4c93-8c94-ae052eb02db4/cinder-volume/0.log" Feb 19 20:01:23 crc kubenswrapper[4749]: I0219 20:01:23.581840 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_afe70cf9-b491-4c93-8c94-ae052eb02db4/probe/0.log" Feb 19 20:01:23 crc kubenswrapper[4749]: I0219 20:01:23.765239 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f2a5ecf9-3280-4da3-9ea6-6491401e4daa/probe/0.log" Feb 19 20:01:23 crc kubenswrapper[4749]: I0219 20:01:23.828979 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z_3c39d9c2-1081-4cf9-96c7-746c51a42207/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:23 crc kubenswrapper[4749]: I0219 20:01:23.933186 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f2a5ecf9-3280-4da3-9ea6-6491401e4daa/cinder-volume/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.035153 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qw75z_780425c2-7f97-4d00-a992-bf0ef5be3876/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.160239 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c96bd5bf7-wmjrv_7707fe3e-adab-4755-bf50-f74bb3924913/init/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.409393 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c96bd5bf7-wmjrv_7707fe3e-adab-4755-bf50-f74bb3924913/init/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.439038 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k_fe29f0cb-bc56-4d7b-983c-52667a6c4ceb/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.481210 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c96bd5bf7-wmjrv_7707fe3e-adab-4755-bf50-f74bb3924913/dnsmasq-dns/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.646018 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b40edc19-78bc-456e-ad9f-c3dcae644950/glance-log/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.654150 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b40edc19-78bc-456e-ad9f-c3dcae644950/glance-httpd/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.725554 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.725627 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.725682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.726610 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.726685 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" gracePeriod=600 Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.777864 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4d9d8ca6-8ca8-415e-9120-5c48d275052c/glance-httpd/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: E0219 20:01:24.853309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.882089 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4d9d8ca6-8ca8-415e-9120-5c48d275052c/glance-log/0.log" Feb 19 20:01:24 crc kubenswrapper[4749]: I0219 20:01:24.951197 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b4d589db8-c89ft_8812ac95-8284-4b4f-a838-b5ab30a55fad/horizon/0.log" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.060482 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" 
containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" exitCode=0 Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.060526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"} Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.060588 4749 scope.go:117] "RemoveContainer" containerID="67efe7d26f397e5c6c83b5c604adacb30f5ae2a6ee4a04b1e3e91db09ca88945" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.061385 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:01:25 crc kubenswrapper[4749]: E0219 20:01:25.061697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.132810 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2_486b7134-04b2-4255-b831-c7da1c6fdcfe/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.264989 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hkxkr_59c9b6e2-493f-4c1a-ae9b-47dca8a2658d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.513202 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29525461-vkh7z_bdfb9341-a712-489d-ba8e-01ce41b5d1cb/keystone-cron/0.log" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.718302 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525521-n6rn5_9759f20d-bfd3-4538-b451-418ffaa00853/keystone-cron/0.log" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.834807 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b4d589db8-c89ft_8812ac95-8284-4b4f-a838-b5ab30a55fad/horizon-log/0.log" Feb 19 20:01:25 crc kubenswrapper[4749]: I0219 20:01:25.961917 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e6855d59-78cd-4386-b41b-8670ebdadafb/kube-state-metrics/0.log" Feb 19 20:01:26 crc kubenswrapper[4749]: I0219 20:01:26.006576 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-59976cccdd-n7pz6_559ba668-4259-4ec1-a8b7-e6ab6b78d4b6/keystone-api/0.log" Feb 19 20:01:26 crc kubenswrapper[4749]: I0219 20:01:26.053910 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4_1032ad4c-247e-48e1-805c-31aadc54415d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:26 crc kubenswrapper[4749]: I0219 20:01:26.531146 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27_cdcbdafe-8bad-41be-91bb-59bc54994227/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:26 crc kubenswrapper[4749]: I0219 20:01:26.596127 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85c7d94649-hz2gq_3bc4f02b-0135-46b5-ad46-aa2a9ce82f54/neutron-httpd/0.log" Feb 19 20:01:26 crc kubenswrapper[4749]: I0219 20:01:26.660247 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-85c7d94649-hz2gq_3bc4f02b-0135-46b5-ad46-aa2a9ce82f54/neutron-api/0.log" Feb 19 20:01:26 crc kubenswrapper[4749]: I0219 20:01:26.785987 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_27dfe8e9-686d-4703-b36d-df6b94491b40/setup-container/0.log" Feb 19 20:01:27 crc kubenswrapper[4749]: I0219 20:01:27.236735 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_27dfe8e9-686d-4703-b36d-df6b94491b40/setup-container/0.log" Feb 19 20:01:27 crc kubenswrapper[4749]: I0219 20:01:27.291981 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_27dfe8e9-686d-4703-b36d-df6b94491b40/rabbitmq/0.log" Feb 19 20:01:27 crc kubenswrapper[4749]: I0219 20:01:27.900059 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4ccfb5c8-e819-4a4c-bf02-d7dd004d970e/nova-cell0-conductor-conductor/0.log" Feb 19 20:01:28 crc kubenswrapper[4749]: I0219 20:01:28.350021 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0780a73c-852b-470b-9de7-61afde153d72/nova-cell1-conductor-conductor/0.log" Feb 19 20:01:28 crc kubenswrapper[4749]: I0219 20:01:28.642963 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a05f7aea-8655-4484-ad07-c9c6f0e98880/nova-cell1-novncproxy-novncproxy/0.log" Feb 19 20:01:28 crc kubenswrapper[4749]: I0219 20:01:28.711326 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26c708f4-7611-4c40-9dc9-8b941ff97b87/nova-api-log/0.log" Feb 19 20:01:28 crc kubenswrapper[4749]: I0219 20:01:28.879468 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-zr2gj_e120e358-960c-435a-9655-35499a01c0c0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:29 crc 
kubenswrapper[4749]: I0219 20:01:29.013429 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26c708f4-7611-4c40-9dc9-8b941ff97b87/nova-api-api/0.log" Feb 19 20:01:29 crc kubenswrapper[4749]: I0219 20:01:29.020529 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3380219c-08a7-4ecd-8646-6e39cb13137b/nova-metadata-log/0.log" Feb 19 20:01:29 crc kubenswrapper[4749]: I0219 20:01:29.795211 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e84776ec-57db-4685-84f6-f86655d9f079/mysql-bootstrap/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.049831 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e84776ec-57db-4685-84f6-f86655d9f079/mysql-bootstrap/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.050765 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704/nova-scheduler-scheduler/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.115779 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e84776ec-57db-4685-84f6-f86655d9f079/galera/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.319888 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affb1316-cbf5-4641-bdd7-186e390b9e7e/mysql-bootstrap/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.500087 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affb1316-cbf5-4641-bdd7-186e390b9e7e/mysql-bootstrap/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.590842 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affb1316-cbf5-4641-bdd7-186e390b9e7e/galera/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.730016 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d676f6e-9d56-41ab-9689-a19a0b9665f7/openstackclient/0.log" Feb 19 20:01:30 crc kubenswrapper[4749]: I0219 20:01:30.930640 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-sbvrv_59199734-1adc-46b0-9208-75331e4b868c/openstack-network-exporter/0.log" Feb 19 20:01:31 crc kubenswrapper[4749]: I0219 20:01:31.154508 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovsdb-server-init/0.log" Feb 19 20:01:31 crc kubenswrapper[4749]: I0219 20:01:31.185086 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3380219c-08a7-4ecd-8646-6e39cb13137b/nova-metadata-metadata/0.log" Feb 19 20:01:31 crc kubenswrapper[4749]: I0219 20:01:31.332805 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovsdb-server-init/0.log" Feb 19 20:01:31 crc kubenswrapper[4749]: I0219 20:01:31.410342 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovsdb-server/0.log" Feb 19 20:01:31 crc kubenswrapper[4749]: I0219 20:01:31.809248 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ztkm4_7232b466-ffe3-4eab-ad4c-bb2ccac65929/ovn-controller/0.log" Feb 19 20:01:31 crc kubenswrapper[4749]: I0219 20:01:31.823210 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovs-vswitchd/0.log" Feb 19 20:01:31 crc kubenswrapper[4749]: I0219 20:01:31.936379 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-qtct9_8c86e6e6-0776-48c9-9c58-e1b2d41a4552/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:32 crc 
kubenswrapper[4749]: I0219 20:01:32.051997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2231660b-7776-4cf8-a793-7d592dd23ecf/openstack-network-exporter/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.110541 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2231660b-7776-4cf8-a793-7d592dd23ecf/ovn-northd/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.228059 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c7f450e-9c65-4f47-a259-c6e667660b59/openstack-network-exporter/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.387543 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c7f450e-9c65-4f47-a259-c6e667660b59/ovsdbserver-nb/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.398711 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e2f6471c-3fea-45fc-8702-9022ff831352/openstack-network-exporter/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.487360 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e2f6471c-3fea-45fc-8702-9022ff831352/ovsdbserver-sb/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.722056 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-548647668b-bwckt_d7dd258c-a64d-49cc-acf0-5bf79f10e8a5/placement-api/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.894918 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-548647668b-bwckt_d7dd258c-a64d-49cc-acf0-5bf79f10e8a5/placement-log/0.log" Feb 19 20:01:32 crc kubenswrapper[4749]: I0219 20:01:32.943341 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/init-config-reloader/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: 
I0219 20:01:33.066302 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/config-reloader/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.093930 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/init-config-reloader/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.177728 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/prometheus/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.203602 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/thanos-sidecar/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.351192 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6783e255-9125-478b-8c87-61176c735e2c/setup-container/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.531229 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6783e255-9125-478b-8c87-61176c735e2c/setup-container/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.595784 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f675112-5cb9-4988-b346-b29f1e2699f9/setup-container/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.667061 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6783e255-9125-478b-8c87-61176c735e2c/rabbitmq/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.928364 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f675112-5cb9-4988-b346-b29f1e2699f9/rabbitmq/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.939078 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f675112-5cb9-4988-b346-b29f1e2699f9/setup-container/0.log" Feb 19 20:01:33 crc kubenswrapper[4749]: I0219 20:01:33.948756 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd_52a3df8c-e606-4fe8-990f-cef2a807956d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.212270 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-5cmm6_ab3e2ecf-4d0f-4482-87fc-1098e7b8818a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.238366 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42_bf17385f-8d5b-43a5-82c8-9d8bd893e056/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.413165 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7a890ae9-2fb7-4410-b2a5-3374f5555b0f/memcached/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.430417 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bqhvv_b354c1a0-43cd-442f-b818-54fc0bc89cad/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.535184 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dz7fw_31a27783-6092-4112-97a0-2335f4f251b4/ssh-known-hosts-edpm-deployment/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.726372 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6966bc7795-zbh89_3ebc6a8f-ca72-408a-8add-2a21e7a4c803/proxy-server/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.785728 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6966bc7795-zbh89_3ebc6a8f-ca72-408a-8add-2a21e7a4c803/proxy-httpd/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.788574 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-czb4z_f1234ce5-5e40-4f76-a3b5-8b47853bf147/swift-ring-rebalance/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.925830 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-auditor/0.log" Feb 19 20:01:34 crc kubenswrapper[4749]: I0219 20:01:34.976471 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-reaper/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.005886 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-replicator/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.021173 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-server/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.122794 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-auditor/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.159266 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-server/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.172080 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-replicator/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.214797 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-updater/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.277771 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-auditor/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.370165 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-expirer/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.401995 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-server/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.419830 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-replicator/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.459884 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-updater/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.486258 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/rsync/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.584018 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/swift-recon-cron/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.664733 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-tsflr_39f74bf8-e240-408d-a674-c61fcf66fd06/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:35 crc kubenswrapper[4749]: I0219 20:01:35.690361 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_656c9f00-c5aa-4d25-b425-84c0ce173433/tempest-tests-tempest-tests-runner/0.log" Feb 19 20:01:36 crc kubenswrapper[4749]: I0219 20:01:36.102122 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_00d5b3f4-f6de-4204-a2a8-633a9d9041e3/test-operator-logs-container/0.log" Feb 19 20:01:36 crc kubenswrapper[4749]: I0219 20:01:36.189394 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf_91c1231f-dc2c-4c68-ba4f-e6d99913bd60/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:01:36 crc kubenswrapper[4749]: I0219 20:01:36.795486 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_f97a54ef-7ca6-4ad1-951b-5c05572b591a/watcher-applier/0.log" Feb 19 20:01:37 crc kubenswrapper[4749]: I0219 20:01:37.313685 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6/watcher-api-log/0.log" Feb 19 20:01:38 crc kubenswrapper[4749]: I0219 20:01:38.678477 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:01:38 crc kubenswrapper[4749]: E0219 20:01:38.679059 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:01:39 crc kubenswrapper[4749]: I0219 20:01:39.814482 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-decision-engine-0_5f18804d-f75a-4e9c-ba11-ba225b074df7/watcher-decision-engine/0.log" Feb 19 20:01:40 crc kubenswrapper[4749]: I0219 20:01:40.288806 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6/watcher-api/0.log" Feb 19 20:01:52 crc kubenswrapper[4749]: I0219 20:01:52.679237 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:01:52 crc kubenswrapper[4749]: E0219 20:01:52.679920 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:02:02 crc kubenswrapper[4749]: I0219 20:02:02.899218 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/util/0.log" Feb 19 20:02:03 crc kubenswrapper[4749]: I0219 20:02:03.016875 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/util/0.log" Feb 19 20:02:03 crc kubenswrapper[4749]: I0219 20:02:03.043805 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/pull/0.log" Feb 19 20:02:03 crc kubenswrapper[4749]: I0219 20:02:03.084370 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/pull/0.log" Feb 19 20:02:03 crc kubenswrapper[4749]: I0219 20:02:03.243624 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/util/0.log" Feb 19 20:02:03 crc kubenswrapper[4749]: I0219 20:02:03.257070 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/extract/0.log" Feb 19 20:02:03 crc kubenswrapper[4749]: I0219 20:02:03.285522 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/pull/0.log" Feb 19 20:02:03 crc kubenswrapper[4749]: I0219 20:02:03.674222 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-5k6kx_82c521c0-6968-4298-afc8-e4aac617b61d/manager/0.log" Feb 19 20:02:04 crc kubenswrapper[4749]: I0219 20:02:04.036259 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-kqx52_872c81d0-4024-4678-a081-6698ee2fe586/manager/0.log" Feb 19 20:02:04 crc kubenswrapper[4749]: I0219 20:02:04.284303 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-xmlcc_b5e9baf7-cc67-4cde-ba8f-256eb3c5601f/manager/0.log" Feb 19 20:02:04 crc kubenswrapper[4749]: I0219 20:02:04.529573 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-g44zq_95419345-6f7d-4cb6-b0c6-75bdebf35ade/manager/0.log" Feb 19 20:02:04 crc kubenswrapper[4749]: I0219 
20:02:04.678662 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:02:04 crc kubenswrapper[4749]: E0219 20:02:04.678943 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:02:04 crc kubenswrapper[4749]: I0219 20:02:04.926714 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-66xhq_184b233b-5456-42e8-a09b-61221754095e/manager/0.log" Feb 19 20:02:05 crc kubenswrapper[4749]: I0219 20:02:05.393657 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-7c76f_bda10183-e834-4f98-a0cf-47ce14f1d333/manager/0.log" Feb 19 20:02:05 crc kubenswrapper[4749]: I0219 20:02:05.447330 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-c5zvr_0c080714-223f-4954-81ad-0fbf2d7ceff1/manager/0.log" Feb 19 20:02:05 crc kubenswrapper[4749]: I0219 20:02:05.657736 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-g4tkq_eaf1fbac-75fa-4442-811d-8f51e3a1e66b/manager/0.log" Feb 19 20:02:05 crc kubenswrapper[4749]: I0219 20:02:05.905789 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-5rm6j_0e32dc41-84cc-42d1-bbf8-be4aa4d4b010/manager/0.log" Feb 19 20:02:05 crc kubenswrapper[4749]: I0219 20:02:05.969109 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-nlt7x_7938ade1-7dc4-4927-b620-4cdcb7125a94/manager/0.log" Feb 19 20:02:06 crc kubenswrapper[4749]: I0219 20:02:06.275726 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-k6k2h_aa7dbc9c-deb4-49ae-a0c7-2130343cae10/manager/0.log" Feb 19 20:02:06 crc kubenswrapper[4749]: I0219 20:02:06.495544 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-x87ll_655da61c-572a-43c2-8d53-c3a3e0f95d43/manager/0.log" Feb 19 20:02:07 crc kubenswrapper[4749]: I0219 20:02:07.156564 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cft725_b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d/manager/0.log" Feb 19 20:02:07 crc kubenswrapper[4749]: I0219 20:02:07.583556 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-857dd64d7c-c9mq9_83f4c4a7-f126-44f4-9780-82e159ec9ec7/operator/0.log" Feb 19 20:02:07 crc kubenswrapper[4749]: I0219 20:02:07.791989 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-n89m9_c177988f-7956-4b60-aaf0-0ece549e28cb/registry-server/0.log" Feb 19 20:02:08 crc kubenswrapper[4749]: I0219 20:02:08.122710 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-972mw_61c9ed9f-0dff-4560-a6d3-a621e1a6ff09/manager/0.log" Feb 19 20:02:08 crc kubenswrapper[4749]: I0219 20:02:08.323051 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-2w6pw_f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3/manager/0.log" Feb 19 20:02:08 crc kubenswrapper[4749]: I0219 20:02:08.885672 4749 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-rtd4f_4a2510e7-7b2d-445a-b092-74831cb6701e/operator/0.log" Feb 19 20:02:09 crc kubenswrapper[4749]: I0219 20:02:09.104902 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-mcgwg_c8870691-c7b5-4715-8db1-2ac0f6c56ad9/manager/0.log" Feb 19 20:02:09 crc kubenswrapper[4749]: I0219 20:02:09.433743 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-875j6_b31580cf-6da4-442c-aa01-bed52414bf52/manager/0.log" Feb 19 20:02:09 crc kubenswrapper[4749]: I0219 20:02:09.666366 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-7cjck_a8475ec0-8fed-454b-9d2e-7008db016ae4/manager/0.log" Feb 19 20:02:09 crc kubenswrapper[4749]: I0219 20:02:09.753523 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-xx724_e6eac27e-c253-4729-9171-7adca82bbf48/manager/0.log" Feb 19 20:02:09 crc kubenswrapper[4749]: I0219 20:02:09.786445 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c59d96f56-stlgf_4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382/manager/0.log" Feb 19 20:02:09 crc kubenswrapper[4749]: I0219 20:02:09.983361 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-56fd5cc5c9-s7k5m_aaf03c23-f79b-4c42-9350-dd35ace208e3/manager/0.log" Feb 19 20:02:15 crc kubenswrapper[4749]: I0219 20:02:15.896400 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-54ths_ed276a06-3dcf-475c-8d9c-1ee1c364f783/manager/0.log" Feb 19 20:02:19 crc kubenswrapper[4749]: I0219 20:02:19.679237 4749 scope.go:117] 
"RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:02:19 crc kubenswrapper[4749]: E0219 20:02:19.680110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.253534 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sp8np"] Feb 19 20:02:21 crc kubenswrapper[4749]: E0219 20:02:21.254562 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9759f20d-bfd3-4538-b451-418ffaa00853" containerName="keystone-cron" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.254583 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9759f20d-bfd3-4538-b451-418ffaa00853" containerName="keystone-cron" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.254852 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9759f20d-bfd3-4538-b451-418ffaa00853" containerName="keystone-cron" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.256703 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.262566 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sp8np"] Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.341743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-catalog-content\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.341871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-utilities\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.341990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8n45\" (UniqueName: \"kubernetes.io/projected/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-kube-api-access-r8n45\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.443639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-catalog-content\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.443754 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-utilities\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.443820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8n45\" (UniqueName: \"kubernetes.io/projected/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-kube-api-access-r8n45\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.444274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-catalog-content\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.444336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-utilities\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.464211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8n45\" (UniqueName: \"kubernetes.io/projected/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-kube-api-access-r8n45\") pod \"community-operators-sp8np\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:21 crc kubenswrapper[4749]: I0219 20:02:21.575689 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:22 crc kubenswrapper[4749]: I0219 20:02:22.112198 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sp8np"] Feb 19 20:02:22 crc kubenswrapper[4749]: I0219 20:02:22.611912 4749 generic.go:334] "Generic (PLEG): container finished" podID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerID="60ad564fa704d640cff82652bdf14c93893f2df1ad7009868a56d6da6671ae51" exitCode=0 Feb 19 20:02:22 crc kubenswrapper[4749]: I0219 20:02:22.612001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp8np" event={"ID":"bed62dc1-2555-4e13-8404-1e1cf40dd8a6","Type":"ContainerDied","Data":"60ad564fa704d640cff82652bdf14c93893f2df1ad7009868a56d6da6671ae51"} Feb 19 20:02:22 crc kubenswrapper[4749]: I0219 20:02:22.612559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp8np" event={"ID":"bed62dc1-2555-4e13-8404-1e1cf40dd8a6","Type":"ContainerStarted","Data":"4ba2188729279b9c320031dead257140b437eba35df26a1a8c29b85313127b55"} Feb 19 20:02:23 crc kubenswrapper[4749]: I0219 20:02:23.625451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp8np" event={"ID":"bed62dc1-2555-4e13-8404-1e1cf40dd8a6","Type":"ContainerStarted","Data":"99fc5db888a60c89a4c814a6edc679689c458c4fb195f6bb613c79b15364037e"} Feb 19 20:02:26 crc kubenswrapper[4749]: I0219 20:02:26.658595 4749 generic.go:334] "Generic (PLEG): container finished" podID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerID="99fc5db888a60c89a4c814a6edc679689c458c4fb195f6bb613c79b15364037e" exitCode=0 Feb 19 20:02:26 crc kubenswrapper[4749]: I0219 20:02:26.658630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp8np" 
event={"ID":"bed62dc1-2555-4e13-8404-1e1cf40dd8a6","Type":"ContainerDied","Data":"99fc5db888a60c89a4c814a6edc679689c458c4fb195f6bb613c79b15364037e"} Feb 19 20:02:27 crc kubenswrapper[4749]: I0219 20:02:27.671139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp8np" event={"ID":"bed62dc1-2555-4e13-8404-1e1cf40dd8a6","Type":"ContainerStarted","Data":"5e3ef17817e5904ad667c7f2d901d068876d4e4e3d9de54f3426c6cf4c38a39d"} Feb 19 20:02:27 crc kubenswrapper[4749]: I0219 20:02:27.692960 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sp8np" podStartSLOduration=2.191745456 podStartE2EDuration="6.692941589s" podCreationTimestamp="2026-02-19 20:02:21 +0000 UTC" firstStartedPulling="2026-02-19 20:02:22.614883408 +0000 UTC m=+5316.576103362" lastFinishedPulling="2026-02-19 20:02:27.116079541 +0000 UTC m=+5321.077299495" observedRunningTime="2026-02-19 20:02:27.69175888 +0000 UTC m=+5321.652978834" watchObservedRunningTime="2026-02-19 20:02:27.692941589 +0000 UTC m=+5321.654161543" Feb 19 20:02:31 crc kubenswrapper[4749]: I0219 20:02:31.575833 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:31 crc kubenswrapper[4749]: I0219 20:02:31.576321 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:31 crc kubenswrapper[4749]: I0219 20:02:31.625208 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:33 crc kubenswrapper[4749]: I0219 20:02:33.679561 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:02:33 crc kubenswrapper[4749]: E0219 20:02:33.680220 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:02:33 crc kubenswrapper[4749]: I0219 20:02:33.973266 4749 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.294922821s: [/var/lib/containers/storage/overlay/3c68d512a61f648dcb6147160b8ba73f971982c2339144d416fce9505d9c6cd5/diff /var/log/pods/openstack_watcher-api-0_6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6/watcher-api/0.log]; will not log again for this container unless duration exceeds 2s Feb 19 20:02:34 crc kubenswrapper[4749]: I0219 20:02:34.159521 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nl4bn_18ce2742-770a-492b-a2c1-b1c615b27c71/control-plane-machine-set-operator/0.log" Feb 19 20:02:34 crc kubenswrapper[4749]: I0219 20:02:34.227134 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qbnpr_fc385e2a-5c57-49bc-a308-57a35663a452/kube-rbac-proxy/0.log" Feb 19 20:02:34 crc kubenswrapper[4749]: I0219 20:02:34.376926 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qbnpr_fc385e2a-5c57-49bc-a308-57a35663a452/machine-api-operator/0.log" Feb 19 20:02:41 crc kubenswrapper[4749]: I0219 20:02:41.635346 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:41 crc kubenswrapper[4749]: I0219 20:02:41.933006 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sp8np"] Feb 19 20:02:41 crc kubenswrapper[4749]: I0219 20:02:41.933287 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sp8np" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="registry-server" containerID="cri-o://5e3ef17817e5904ad667c7f2d901d068876d4e4e3d9de54f3426c6cf4c38a39d" gracePeriod=2 Feb 19 20:02:42 crc kubenswrapper[4749]: I0219 20:02:42.813223 4749 generic.go:334] "Generic (PLEG): container finished" podID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerID="5e3ef17817e5904ad667c7f2d901d068876d4e4e3d9de54f3426c6cf4c38a39d" exitCode=0 Feb 19 20:02:42 crc kubenswrapper[4749]: I0219 20:02:42.813300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp8np" event={"ID":"bed62dc1-2555-4e13-8404-1e1cf40dd8a6","Type":"ContainerDied","Data":"5e3ef17817e5904ad667c7f2d901d068876d4e4e3d9de54f3426c6cf4c38a39d"} Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.443138 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.607211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-catalog-content\") pod \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.607484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8n45\" (UniqueName: \"kubernetes.io/projected/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-kube-api-access-r8n45\") pod \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.607590 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-utilities\") pod \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\" (UID: \"bed62dc1-2555-4e13-8404-1e1cf40dd8a6\") " Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.608366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-utilities" (OuterVolumeSpecName: "utilities") pod "bed62dc1-2555-4e13-8404-1e1cf40dd8a6" (UID: "bed62dc1-2555-4e13-8404-1e1cf40dd8a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.612819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-kube-api-access-r8n45" (OuterVolumeSpecName: "kube-api-access-r8n45") pod "bed62dc1-2555-4e13-8404-1e1cf40dd8a6" (UID: "bed62dc1-2555-4e13-8404-1e1cf40dd8a6"). InnerVolumeSpecName "kube-api-access-r8n45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.657732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bed62dc1-2555-4e13-8404-1e1cf40dd8a6" (UID: "bed62dc1-2555-4e13-8404-1e1cf40dd8a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.710257 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8n45\" (UniqueName: \"kubernetes.io/projected/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-kube-api-access-r8n45\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.710301 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.710315 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed62dc1-2555-4e13-8404-1e1cf40dd8a6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.826672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp8np" event={"ID":"bed62dc1-2555-4e13-8404-1e1cf40dd8a6","Type":"ContainerDied","Data":"4ba2188729279b9c320031dead257140b437eba35df26a1a8c29b85313127b55"} Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.826753 4749 scope.go:117] "RemoveContainer" containerID="5e3ef17817e5904ad667c7f2d901d068876d4e4e3d9de54f3426c6cf4c38a39d" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.826780 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sp8np" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.856873 4749 scope.go:117] "RemoveContainer" containerID="99fc5db888a60c89a4c814a6edc679689c458c4fb195f6bb613c79b15364037e" Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.889412 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sp8np"] Feb 19 20:02:43 crc kubenswrapper[4749]: I0219 20:02:43.901065 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sp8np"] Feb 19 20:02:44 crc kubenswrapper[4749]: I0219 20:02:44.250794 4749 scope.go:117] "RemoveContainer" containerID="60ad564fa704d640cff82652bdf14c93893f2df1ad7009868a56d6da6671ae51" Feb 19 20:02:44 crc kubenswrapper[4749]: I0219 20:02:44.678916 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:02:44 crc kubenswrapper[4749]: E0219 20:02:44.679187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:02:44 crc kubenswrapper[4749]: I0219 20:02:44.695412 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" path="/var/lib/kubelet/pods/bed62dc1-2555-4e13-8404-1e1cf40dd8a6/volumes" Feb 19 20:02:47 crc kubenswrapper[4749]: I0219 20:02:47.344522 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7ngvd_30e026fc-9274-4942-bf3d-68740957aeec/cert-manager-controller/0.log" Feb 19 20:02:47 crc kubenswrapper[4749]: I0219 
20:02:47.577246 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-gqvc7_fc78af5c-d237-4523-8035-d8992d4b539c/cert-manager-webhook/0.log" Feb 19 20:02:47 crc kubenswrapper[4749]: I0219 20:02:47.587761 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-pbj6v_b4703030-f4cb-4751-a9e1-5a6c1c9f4332/cert-manager-cainjector/0.log" Feb 19 20:02:58 crc kubenswrapper[4749]: I0219 20:02:58.678924 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:02:58 crc kubenswrapper[4749]: E0219 20:02:58.679799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:03:00 crc kubenswrapper[4749]: I0219 20:03:00.600747 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-b57rs_84b14d9d-fa97-4391-9bd8-f3c5680ad7d1/nmstate-console-plugin/0.log" Feb 19 20:03:00 crc kubenswrapper[4749]: I0219 20:03:00.769418 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vg82k_6cacdce3-57f6-4ae5-bcdd-6d94b938a155/nmstate-handler/0.log" Feb 19 20:03:00 crc kubenswrapper[4749]: I0219 20:03:00.788424 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kc58p_2f498838-9a5f-4320-9044-3602de46b7cb/kube-rbac-proxy/0.log" Feb 19 20:03:00 crc kubenswrapper[4749]: I0219 20:03:00.856132 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kc58p_2f498838-9a5f-4320-9044-3602de46b7cb/nmstate-metrics/0.log" Feb 19 20:03:00 crc kubenswrapper[4749]: I0219 20:03:00.968208 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-4thb7_0339e5b6-a614-4424-8375-01b24fd90b54/nmstate-operator/0.log" Feb 19 20:03:01 crc kubenswrapper[4749]: I0219 20:03:01.074644 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-9lfmz_9c7d501a-b552-4c50-960c-63ae5826b93a/nmstate-webhook/0.log" Feb 19 20:03:12 crc kubenswrapper[4749]: I0219 20:03:12.193313 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-2w6pw" podUID="f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 20:03:12 crc kubenswrapper[4749]: I0219 20:03:12.679677 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:03:12 crc kubenswrapper[4749]: E0219 20:03:12.680292 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:03:16 crc kubenswrapper[4749]: I0219 20:03:16.310454 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb_6107d2c9-e758-426c-8c42-d8a9241b1ce8/prometheus-operator-admission-webhook/0.log" Feb 19 
20:03:18 crc kubenswrapper[4749]: I0219 20:03:18.742538 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e481767e-68e7-4396-b8aa-51956e378132" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 19 20:03:18 crc kubenswrapper[4749]: I0219 20:03:18.914614 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qs22p_456b4e23-1427-4b46-9672-87cff5dd12b9/perses-operator/0.log" Feb 19 20:03:23 crc kubenswrapper[4749]: I0219 20:03:23.679260 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:03:23 crc kubenswrapper[4749]: E0219 20:03:23.680006 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:03:23 crc kubenswrapper[4749]: I0219 20:03:23.737315 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e481767e-68e7-4396-b8aa-51956e378132" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 19 20:03:28 crc kubenswrapper[4749]: I0219 20:03:28.738365 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e481767e-68e7-4396-b8aa-51956e378132" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 19 20:03:28 crc kubenswrapper[4749]: I0219 20:03:28.739078 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Feb 19 20:03:28 crc kubenswrapper[4749]: I0219 20:03:28.740282 4749 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"fdcb8ebd474c6d3a8a3b25a862388573fc16e225042117f39f94ed4de889e608"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Feb 19 20:03:28 crc kubenswrapper[4749]: I0219 20:03:28.740502 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e481767e-68e7-4396-b8aa-51956e378132" containerName="ceilometer-central-agent" containerID="cri-o://fdcb8ebd474c6d3a8a3b25a862388573fc16e225042117f39f94ed4de889e608" gracePeriod=30 Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:31.416121 4749 patch_prober.go:28] interesting pod/route-controller-manager-6754d5984f-zksdl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:31.416982 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6754d5984f-zksdl" podUID="772b35c8-0ee5-4853-846f-1332e07b872e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:32.263357 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" podUID="fc78af5c-d237-4523-8035-d8992d4b539c" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.78:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:37.305219 
4749 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" podUID="fc78af5c-d237-4523-8035-d8992d4b539c" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.78:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:37.305244 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-gqvc7" podUID="fc78af5c-d237-4523-8035-d8992d4b539c" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.78:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:38.541960 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8_8b734a5b-56c7-4001-8c16-e4a75f50afb3/prometheus-operator-admission-webhook/0.log" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:38.589150 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hcn88_f02e6039-b225-4177-9704-4cdd8b15f297/prometheus-operator/0.log" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:38.597621 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-74zcg_74ad505e-d15c-43b4-b072-444ffdedf939/operator/0.log" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:38.679845 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:03:40 crc kubenswrapper[4749]: E0219 20:03:38.680197 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:40.406063 4749 generic.go:334] "Generic (PLEG): container finished" podID="e481767e-68e7-4396-b8aa-51956e378132" containerID="fdcb8ebd474c6d3a8a3b25a862388573fc16e225042117f39f94ed4de889e608" exitCode=0 Feb 19 20:03:40 crc kubenswrapper[4749]: I0219 20:03:40.406609 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e481767e-68e7-4396-b8aa-51956e378132","Type":"ContainerDied","Data":"fdcb8ebd474c6d3a8a3b25a862388573fc16e225042117f39f94ed4de889e608"} Feb 19 20:03:42 crc kubenswrapper[4749]: I0219 20:03:42.428825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e481767e-68e7-4396-b8aa-51956e378132","Type":"ContainerStarted","Data":"db9c66dc1a327019530c2311e38cec97ec030bf04a4a814ff7a6a4a2b92912a8"} Feb 19 20:03:45 crc kubenswrapper[4749]: I0219 20:03:45.994865 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-trs7x_0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc/kube-rbac-proxy/0.log" Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.076683 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-trs7x_0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc/controller/0.log" Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.231233 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log" Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.450900 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log" Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 
20:03:46.455753 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.468747 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.479717 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.649127 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.740397 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.740655 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.746388 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.900830 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.908975 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.941295 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/controller/0.log"
Feb 19 20:03:46 crc kubenswrapper[4749]: I0219 20:03:46.975147 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log"
Feb 19 20:03:47 crc kubenswrapper[4749]: I0219 20:03:47.106320 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/frr-metrics/0.log"
Feb 19 20:03:47 crc kubenswrapper[4749]: I0219 20:03:47.118705 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/kube-rbac-proxy/0.log"
Feb 19 20:03:47 crc kubenswrapper[4749]: I0219 20:03:47.202428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/kube-rbac-proxy-frr/0.log"
Feb 19 20:03:47 crc kubenswrapper[4749]: I0219 20:03:47.384842 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/reloader/0.log"
Feb 19 20:03:47 crc kubenswrapper[4749]: I0219 20:03:47.425920 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-66v4b_ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5/frr-k8s-webhook-server/0.log"
Feb 19 20:03:48 crc kubenswrapper[4749]: I0219 20:03:48.060494 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66b7b94c9b-n69x7_9c6d5734-2093-42d7-a330-59c6dc0dc138/webhook-server/0.log"
Feb 19 20:03:48 crc kubenswrapper[4749]: I0219 20:03:48.116837 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74989bddb6-dcst5_bcecd22a-15ba-4bca-8be3-9cc08843c86d/manager/0.log"
Feb 19 20:03:48 crc kubenswrapper[4749]: I0219 20:03:48.641294 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6x54g_d4ac8583-8e6a-4c40-9903-37fe6f82d038/kube-rbac-proxy/0.log"
Feb 19 20:03:49 crc kubenswrapper[4749]: I0219 20:03:49.521680 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6x54g_d4ac8583-8e6a-4c40-9903-37fe6f82d038/speaker/0.log"
Feb 19 20:03:49 crc kubenswrapper[4749]: I0219 20:03:49.975096 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/frr/0.log"
Feb 19 20:03:53 crc kubenswrapper[4749]: I0219 20:03:53.679209 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"
Feb 19 20:03:53 crc kubenswrapper[4749]: E0219 20:03:53.679740 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.144138 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/util/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.327231 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/util/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.354254 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/pull/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.359271 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/pull/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.549893 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/extract/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.556957 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/util/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.598979 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/pull/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.719890 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/util/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.888680 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/util/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.898257 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/pull/0.log"
Feb 19 20:04:01 crc kubenswrapper[4749]: I0219 20:04:01.900391 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/pull/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.035152 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/util/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.053329 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/extract/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.113088 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/pull/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.239977 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-utilities/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.406484 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-utilities/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.428148 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-content/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.438107 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-content/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.577103 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-utilities/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.612124 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-content/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.844080 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-utilities/0.log"
Feb 19 20:04:02 crc kubenswrapper[4749]: I0219 20:04:02.999047 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-utilities/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.064592 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-content/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.162526 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-content/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.327347 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/registry-server/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.414230 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-utilities/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.445963 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-content/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.735978 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/util/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.959019 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/util/0.log"
Feb 19 20:04:03 crc kubenswrapper[4749]: I0219 20:04:03.967855 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/pull/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.208902 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/pull/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.346016 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/util/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.421200 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/pull/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.450503 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/extract/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.478748 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/registry-server/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.602623 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wd24f_22a5e8d0-f222-4a7c-8bb7-51689ef460a8/marketplace-operator/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.679621 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"
Feb 19 20:04:04 crc kubenswrapper[4749]: E0219 20:04:04.679873 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.683906 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-utilities/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.831165 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-utilities/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.843287 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-content/0.log"
Feb 19 20:04:04 crc kubenswrapper[4749]: I0219 20:04:04.878712 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-content/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.062963 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-content/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.063443 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-utilities/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.327304 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/registry-server/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.380878 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-utilities/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.556593 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-content/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.565718 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-utilities/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.569060 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-content/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.723373 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-utilities/0.log"
Feb 19 20:04:05 crc kubenswrapper[4749]: I0219 20:04:05.774998 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-content/0.log"
Feb 19 20:04:06 crc kubenswrapper[4749]: I0219 20:04:06.527342 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/registry-server/0.log"
Feb 19 20:04:17 crc kubenswrapper[4749]: I0219 20:04:17.679446 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"
Feb 19 20:04:17 crc kubenswrapper[4749]: E0219 20:04:17.680231 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:04:18 crc kubenswrapper[4749]: I0219 20:04:18.406907 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb_6107d2c9-e758-426c-8c42-d8a9241b1ce8/prometheus-operator-admission-webhook/0.log"
Feb 19 20:04:18 crc kubenswrapper[4749]: I0219 20:04:18.447858 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hcn88_f02e6039-b225-4177-9704-4cdd8b15f297/prometheus-operator/0.log"
Feb 19 20:04:18 crc kubenswrapper[4749]: I0219 20:04:18.460255 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8_8b734a5b-56c7-4001-8c16-e4a75f50afb3/prometheus-operator-admission-webhook/0.log"
Feb 19 20:04:18 crc kubenswrapper[4749]: I0219 20:04:18.647280 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qs22p_456b4e23-1427-4b46-9672-87cff5dd12b9/perses-operator/0.log"
Feb 19 20:04:18 crc kubenswrapper[4749]: I0219 20:04:18.658345 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-74zcg_74ad505e-d15c-43b4-b072-444ffdedf939/operator/0.log"
Feb 19 20:04:28 crc kubenswrapper[4749]: I0219 20:04:28.678848 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"
Feb 19 20:04:28 crc kubenswrapper[4749]: E0219 20:04:28.679731 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:04:28 crc kubenswrapper[4749]: E0219 20:04:28.896443 4749 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.128:60724->38.102.83.128:36573: read tcp 38.102.83.128:60724->38.102.83.128:36573: read: connection reset by peer
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.142462 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cgp7x"]
Feb 19 20:04:41 crc kubenswrapper[4749]: E0219 20:04:41.143386 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="registry-server"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.143399 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="registry-server"
Feb 19 20:04:41 crc kubenswrapper[4749]: E0219 20:04:41.143411 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="extract-utilities"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.143419 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="extract-utilities"
Feb 19 20:04:41 crc kubenswrapper[4749]: E0219 20:04:41.143446 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="extract-content"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.143452 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="extract-content"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.143618 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed62dc1-2555-4e13-8404-1e1cf40dd8a6" containerName="registry-server"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.145001 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.155797 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgp7x"]
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.207081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2jh\" (UniqueName: \"kubernetes.io/projected/b498d48e-a104-44d3-a210-417da368615b-kube-api-access-pt2jh\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.207167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-utilities\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.207274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-catalog-content\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.309672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-catalog-content\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.309854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2jh\" (UniqueName: \"kubernetes.io/projected/b498d48e-a104-44d3-a210-417da368615b-kube-api-access-pt2jh\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.309927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-utilities\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.310196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-catalog-content\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.310522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-utilities\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.347015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2jh\" (UniqueName: \"kubernetes.io/projected/b498d48e-a104-44d3-a210-417da368615b-kube-api-access-pt2jh\") pod \"redhat-operators-cgp7x\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") " pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:41 crc kubenswrapper[4749]: I0219 20:04:41.470865 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:42 crc kubenswrapper[4749]: I0219 20:04:42.017772 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgp7x"]
Feb 19 20:04:42 crc kubenswrapper[4749]: I0219 20:04:42.679479 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"
Feb 19 20:04:42 crc kubenswrapper[4749]: E0219 20:04:42.680232 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:04:42 crc kubenswrapper[4749]: I0219 20:04:42.982851 4749 generic.go:334] "Generic (PLEG): container finished" podID="b498d48e-a104-44d3-a210-417da368615b" containerID="5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3" exitCode=0
Feb 19 20:04:42 crc kubenswrapper[4749]: I0219 20:04:42.982922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgp7x" event={"ID":"b498d48e-a104-44d3-a210-417da368615b","Type":"ContainerDied","Data":"5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3"}
Feb 19 20:04:42 crc kubenswrapper[4749]: I0219 20:04:42.982981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgp7x" event={"ID":"b498d48e-a104-44d3-a210-417da368615b","Type":"ContainerStarted","Data":"3973a290750c92d97691c3fa04b79769a75f5610b47a48725b472e75e60ec566"}
Feb 19 20:04:42 crc kubenswrapper[4749]: I0219 20:04:42.985302 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 20:04:43 crc kubenswrapper[4749]: I0219 20:04:43.993743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgp7x" event={"ID":"b498d48e-a104-44d3-a210-417da368615b","Type":"ContainerStarted","Data":"52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1"}
Feb 19 20:04:48 crc kubenswrapper[4749]: I0219 20:04:48.031698 4749 generic.go:334] "Generic (PLEG): container finished" podID="b498d48e-a104-44d3-a210-417da368615b" containerID="52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1" exitCode=0
Feb 19 20:04:48 crc kubenswrapper[4749]: I0219 20:04:48.031756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgp7x" event={"ID":"b498d48e-a104-44d3-a210-417da368615b","Type":"ContainerDied","Data":"52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1"}
Feb 19 20:04:49 crc kubenswrapper[4749]: I0219 20:04:49.043589 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgp7x" event={"ID":"b498d48e-a104-44d3-a210-417da368615b","Type":"ContainerStarted","Data":"0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62"}
Feb 19 20:04:49 crc kubenswrapper[4749]: I0219 20:04:49.075595 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cgp7x" podStartSLOduration=2.378085769 podStartE2EDuration="8.07557473s" podCreationTimestamp="2026-02-19 20:04:41 +0000 UTC" firstStartedPulling="2026-02-19 20:04:42.985042097 +0000 UTC m=+5456.946262051" lastFinishedPulling="2026-02-19 20:04:48.682531018 +0000 UTC m=+5462.643751012" observedRunningTime="2026-02-19 20:04:49.069351989 +0000 UTC m=+5463.030571973" watchObservedRunningTime="2026-02-19 20:04:49.07557473 +0000 UTC m=+5463.036794684"
Feb 19 20:04:51 crc kubenswrapper[4749]: I0219 20:04:51.471142 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:51 crc kubenswrapper[4749]: I0219 20:04:51.472403 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:04:52 crc kubenswrapper[4749]: I0219 20:04:52.516727 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cgp7x" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="registry-server" probeResult="failure" output=<
Feb 19 20:04:52 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Feb 19 20:04:52 crc kubenswrapper[4749]: >
Feb 19 20:04:54 crc kubenswrapper[4749]: I0219 20:04:54.680420 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"
Feb 19 20:04:54 crc kubenswrapper[4749]: E0219 20:04:54.680915 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:05:01 crc kubenswrapper[4749]: I0219 20:05:01.519961 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:05:01 crc kubenswrapper[4749]: I0219 20:05:01.585550 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:05:01 crc kubenswrapper[4749]: I0219 20:05:01.763453 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgp7x"]
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.166004 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cgp7x" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="registry-server" containerID="cri-o://0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62" gracePeriod=2
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.700688 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.853801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-catalog-content\") pod \"b498d48e-a104-44d3-a210-417da368615b\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") "
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.854244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-utilities\") pod \"b498d48e-a104-44d3-a210-417da368615b\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") "
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.854349 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt2jh\" (UniqueName: \"kubernetes.io/projected/b498d48e-a104-44d3-a210-417da368615b-kube-api-access-pt2jh\") pod \"b498d48e-a104-44d3-a210-417da368615b\" (UID: \"b498d48e-a104-44d3-a210-417da368615b\") "
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.857577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-utilities" (OuterVolumeSpecName: "utilities") pod "b498d48e-a104-44d3-a210-417da368615b" (UID: "b498d48e-a104-44d3-a210-417da368615b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.867254 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b498d48e-a104-44d3-a210-417da368615b-kube-api-access-pt2jh" (OuterVolumeSpecName: "kube-api-access-pt2jh") pod "b498d48e-a104-44d3-a210-417da368615b" (UID: "b498d48e-a104-44d3-a210-417da368615b"). InnerVolumeSpecName "kube-api-access-pt2jh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.956861 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt2jh\" (UniqueName: \"kubernetes.io/projected/b498d48e-a104-44d3-a210-417da368615b-kube-api-access-pt2jh\") on node \"crc\" DevicePath \"\""
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.956898 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 20:05:03 crc kubenswrapper[4749]: I0219 20:05:03.983485 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b498d48e-a104-44d3-a210-417da368615b" (UID: "b498d48e-a104-44d3-a210-417da368615b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.058971 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b498d48e-a104-44d3-a210-417da368615b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.176450 4749 generic.go:334] "Generic (PLEG): container finished" podID="b498d48e-a104-44d3-a210-417da368615b" containerID="0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62" exitCode=0
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.176527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgp7x" event={"ID":"b498d48e-a104-44d3-a210-417da368615b","Type":"ContainerDied","Data":"0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62"}
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.176617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgp7x" event={"ID":"b498d48e-a104-44d3-a210-417da368615b","Type":"ContainerDied","Data":"3973a290750c92d97691c3fa04b79769a75f5610b47a48725b472e75e60ec566"}
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.176568 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgp7x"
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.185990 4749 scope.go:117] "RemoveContainer" containerID="0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62"
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.220607 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgp7x"]
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.223384 4749 scope.go:117] "RemoveContainer" containerID="52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1"
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.230990 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cgp7x"]
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.277672 4749 scope.go:117] "RemoveContainer" containerID="5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3"
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.322922 4749 scope.go:117] "RemoveContainer" containerID="0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62"
Feb 19 20:05:04 crc kubenswrapper[4749]: E0219 20:05:04.323390 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62\": container with ID starting with 0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62 not found: ID does not exist" containerID="0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62"
Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.323427 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62"} err="failed to get container status \"0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62\": rpc error: code = NotFound desc = could not find container
\"0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62\": container with ID starting with 0b0250616537b9714078f7092738f1b27b56c70f2f512dcc93f76e05713dba62 not found: ID does not exist" Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.323449 4749 scope.go:117] "RemoveContainer" containerID="52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1" Feb 19 20:05:04 crc kubenswrapper[4749]: E0219 20:05:04.323795 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1\": container with ID starting with 52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1 not found: ID does not exist" containerID="52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1" Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.323819 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1"} err="failed to get container status \"52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1\": rpc error: code = NotFound desc = could not find container \"52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1\": container with ID starting with 52cadead7f4c02c6c60cc988d1e02c8b14609fdbc6706dafbb0f02e17e2b47a1 not found: ID does not exist" Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.323832 4749 scope.go:117] "RemoveContainer" containerID="5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3" Feb 19 20:05:04 crc kubenswrapper[4749]: E0219 20:05:04.324919 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3\": container with ID starting with 5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3 not found: ID does not exist" 
containerID="5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3" Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.324944 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3"} err="failed to get container status \"5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3\": rpc error: code = NotFound desc = could not find container \"5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3\": container with ID starting with 5e9cb642d2285be60de8334bc2edc9fced2ef63d27d26ff53c4ad95b93d19ee3 not found: ID does not exist" Feb 19 20:05:04 crc kubenswrapper[4749]: I0219 20:05:04.691664 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b498d48e-a104-44d3-a210-417da368615b" path="/var/lib/kubelet/pods/b498d48e-a104-44d3-a210-417da368615b/volumes" Feb 19 20:05:08 crc kubenswrapper[4749]: I0219 20:05:08.679826 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:05:08 crc kubenswrapper[4749]: E0219 20:05:08.680696 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.678512 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:05:23 crc kubenswrapper[4749]: E0219 20:05:23.679361 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.796891 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2tjbw"] Feb 19 20:05:23 crc kubenswrapper[4749]: E0219 20:05:23.797388 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="registry-server" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.797409 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="registry-server" Feb 19 20:05:23 crc kubenswrapper[4749]: E0219 20:05:23.797431 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="extract-utilities" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.797437 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="extract-utilities" Feb 19 20:05:23 crc kubenswrapper[4749]: E0219 20:05:23.797476 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="extract-content" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.797482 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="extract-content" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.797664 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b498d48e-a104-44d3-a210-417da368615b" containerName="registry-server" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.799075 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.813564 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2tjbw"] Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.907249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-utilities\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.907693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klpft\" (UniqueName: \"kubernetes.io/projected/f9c6891a-c73f-4e30-a775-4bd50e6698f6-kube-api-access-klpft\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:23 crc kubenswrapper[4749]: I0219 20:05:23.907786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-catalog-content\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.009866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klpft\" (UniqueName: \"kubernetes.io/projected/f9c6891a-c73f-4e30-a775-4bd50e6698f6-kube-api-access-klpft\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.010275 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-catalog-content\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.010397 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-utilities\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.010829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-catalog-content\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.010931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-utilities\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.033520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klpft\" (UniqueName: \"kubernetes.io/projected/f9c6891a-c73f-4e30-a775-4bd50e6698f6-kube-api-access-klpft\") pod \"certified-operators-2tjbw\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.130832 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:24 crc kubenswrapper[4749]: I0219 20:05:24.786159 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2tjbw"] Feb 19 20:05:25 crc kubenswrapper[4749]: I0219 20:05:25.441299 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerID="99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f" exitCode=0 Feb 19 20:05:25 crc kubenswrapper[4749]: I0219 20:05:25.441351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tjbw" event={"ID":"f9c6891a-c73f-4e30-a775-4bd50e6698f6","Type":"ContainerDied","Data":"99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f"} Feb 19 20:05:25 crc kubenswrapper[4749]: I0219 20:05:25.441574 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tjbw" event={"ID":"f9c6891a-c73f-4e30-a775-4bd50e6698f6","Type":"ContainerStarted","Data":"9cbf4c94e7d401dee963e0807762c7e27cf85e29d324cf16867c240c31c3749a"} Feb 19 20:05:26 crc kubenswrapper[4749]: I0219 20:05:26.451722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tjbw" event={"ID":"f9c6891a-c73f-4e30-a775-4bd50e6698f6","Type":"ContainerStarted","Data":"a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7"} Feb 19 20:05:27 crc kubenswrapper[4749]: I0219 20:05:27.465679 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerID="a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7" exitCode=0 Feb 19 20:05:27 crc kubenswrapper[4749]: I0219 20:05:27.465768 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tjbw" 
event={"ID":"f9c6891a-c73f-4e30-a775-4bd50e6698f6","Type":"ContainerDied","Data":"a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7"} Feb 19 20:05:28 crc kubenswrapper[4749]: I0219 20:05:28.478352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tjbw" event={"ID":"f9c6891a-c73f-4e30-a775-4bd50e6698f6","Type":"ContainerStarted","Data":"239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870"} Feb 19 20:05:34 crc kubenswrapper[4749]: I0219 20:05:34.131328 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:34 crc kubenswrapper[4749]: I0219 20:05:34.131905 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:34 crc kubenswrapper[4749]: I0219 20:05:34.197442 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:34 crc kubenswrapper[4749]: I0219 20:05:34.219735 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2tjbw" podStartSLOduration=8.797174993 podStartE2EDuration="11.219714381s" podCreationTimestamp="2026-02-19 20:05:23 +0000 UTC" firstStartedPulling="2026-02-19 20:05:25.44530028 +0000 UTC m=+5499.406520234" lastFinishedPulling="2026-02-19 20:05:27.867839668 +0000 UTC m=+5501.829059622" observedRunningTime="2026-02-19 20:05:28.50519166 +0000 UTC m=+5502.466411624" watchObservedRunningTime="2026-02-19 20:05:34.219714381 +0000 UTC m=+5508.180934335" Feb 19 20:05:34 crc kubenswrapper[4749]: I0219 20:05:34.587362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:34 crc kubenswrapper[4749]: I0219 20:05:34.632538 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-2tjbw"] Feb 19 20:05:34 crc kubenswrapper[4749]: I0219 20:05:34.678872 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:05:34 crc kubenswrapper[4749]: E0219 20:05:34.679163 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:05:36 crc kubenswrapper[4749]: I0219 20:05:36.552746 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2tjbw" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="registry-server" containerID="cri-o://239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870" gracePeriod=2 Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.049837 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.127111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-utilities\") pod \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.127226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klpft\" (UniqueName: \"kubernetes.io/projected/f9c6891a-c73f-4e30-a775-4bd50e6698f6-kube-api-access-klpft\") pod \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.127386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-catalog-content\") pod \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\" (UID: \"f9c6891a-c73f-4e30-a775-4bd50e6698f6\") " Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.127997 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-utilities" (OuterVolumeSpecName: "utilities") pod "f9c6891a-c73f-4e30-a775-4bd50e6698f6" (UID: "f9c6891a-c73f-4e30-a775-4bd50e6698f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.132938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c6891a-c73f-4e30-a775-4bd50e6698f6-kube-api-access-klpft" (OuterVolumeSpecName: "kube-api-access-klpft") pod "f9c6891a-c73f-4e30-a775-4bd50e6698f6" (UID: "f9c6891a-c73f-4e30-a775-4bd50e6698f6"). InnerVolumeSpecName "kube-api-access-klpft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.218390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9c6891a-c73f-4e30-a775-4bd50e6698f6" (UID: "f9c6891a-c73f-4e30-a775-4bd50e6698f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.229994 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.230243 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klpft\" (UniqueName: \"kubernetes.io/projected/f9c6891a-c73f-4e30-a775-4bd50e6698f6-kube-api-access-klpft\") on node \"crc\" DevicePath \"\"" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.230304 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c6891a-c73f-4e30-a775-4bd50e6698f6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.565767 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerID="239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870" exitCode=0 Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.565815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tjbw" event={"ID":"f9c6891a-c73f-4e30-a775-4bd50e6698f6","Type":"ContainerDied","Data":"239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870"} Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.565841 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2tjbw" event={"ID":"f9c6891a-c73f-4e30-a775-4bd50e6698f6","Type":"ContainerDied","Data":"9cbf4c94e7d401dee963e0807762c7e27cf85e29d324cf16867c240c31c3749a"} Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.565859 4749 scope.go:117] "RemoveContainer" containerID="239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.565996 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2tjbw" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.605407 4749 scope.go:117] "RemoveContainer" containerID="a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.607947 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2tjbw"] Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.618173 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2tjbw"] Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.635893 4749 scope.go:117] "RemoveContainer" containerID="99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.683614 4749 scope.go:117] "RemoveContainer" containerID="239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870" Feb 19 20:05:37 crc kubenswrapper[4749]: E0219 20:05:37.684116 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870\": container with ID starting with 239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870 not found: ID does not exist" containerID="239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 
20:05:37.684153 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870"} err="failed to get container status \"239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870\": rpc error: code = NotFound desc = could not find container \"239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870\": container with ID starting with 239fb5a1face921b38435768c548c0b5538ea26a93bc40881a3a5c539bb43870 not found: ID does not exist" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.684175 4749 scope.go:117] "RemoveContainer" containerID="a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7" Feb 19 20:05:37 crc kubenswrapper[4749]: E0219 20:05:37.684562 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7\": container with ID starting with a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7 not found: ID does not exist" containerID="a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.684593 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7"} err="failed to get container status \"a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7\": rpc error: code = NotFound desc = could not find container \"a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7\": container with ID starting with a79ba37e538733256f14076eae906805d606934af46f492fca1a7af277798ee7 not found: ID does not exist" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.684611 4749 scope.go:117] "RemoveContainer" containerID="99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f" Feb 19 20:05:37 crc 
kubenswrapper[4749]: E0219 20:05:37.685040 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f\": container with ID starting with 99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f not found: ID does not exist" containerID="99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f" Feb 19 20:05:37 crc kubenswrapper[4749]: I0219 20:05:37.685092 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f"} err="failed to get container status \"99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f\": rpc error: code = NotFound desc = could not find container \"99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f\": container with ID starting with 99487dd3f401652e5cb6739ec26e86a43309d6cc955cebcc0b0281adf696bb2f not found: ID does not exist" Feb 19 20:05:38 crc kubenswrapper[4749]: I0219 20:05:38.689995 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" path="/var/lib/kubelet/pods/f9c6891a-c73f-4e30-a775-4bd50e6698f6/volumes" Feb 19 20:05:45 crc kubenswrapper[4749]: I0219 20:05:45.679271 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:05:45 crc kubenswrapper[4749]: E0219 20:05:45.680323 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:05:57 crc 
kubenswrapper[4749]: I0219 20:05:57.679347 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:05:57 crc kubenswrapper[4749]: E0219 20:05:57.680242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:06:12 crc kubenswrapper[4749]: I0219 20:06:12.679532 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:06:12 crc kubenswrapper[4749]: E0219 20:06:12.680338 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:06:25 crc kubenswrapper[4749]: I0219 20:06:25.679049 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5" Feb 19 20:06:26 crc kubenswrapper[4749]: I0219 20:06:26.246734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"f0ed5f6d4b268527a299c8e916a7c157586f7cffd5eaed0132646118a1ff546c"} Feb 19 20:06:26 crc kubenswrapper[4749]: I0219 20:06:26.659933 4749 scope.go:117] "RemoveContainer" 
containerID="ee6efd9227a60bdb65af71d3d6a049beea9d0a0df051d733be5335e9d7e86e69"
Feb 19 20:06:28 crc kubenswrapper[4749]: I0219 20:06:28.266799 4749 generic.go:334] "Generic (PLEG): container finished" podID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerID="2eab541f23d459c8e4eafc1599a29ee6dab665bbfe1a0cc34cbdd34d4ffd60e5" exitCode=0
Feb 19 20:06:28 crc kubenswrapper[4749]: I0219 20:06:28.266882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sfq27/must-gather-ghk8b" event={"ID":"6baeaaaa-644d-4cd2-bff2-ff6c5e696204","Type":"ContainerDied","Data":"2eab541f23d459c8e4eafc1599a29ee6dab665bbfe1a0cc34cbdd34d4ffd60e5"}
Feb 19 20:06:28 crc kubenswrapper[4749]: I0219 20:06:28.268438 4749 scope.go:117] "RemoveContainer" containerID="2eab541f23d459c8e4eafc1599a29ee6dab665bbfe1a0cc34cbdd34d4ffd60e5"
Feb 19 20:06:28 crc kubenswrapper[4749]: I0219 20:06:28.698169 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sfq27_must-gather-ghk8b_6baeaaaa-644d-4cd2-bff2-ff6c5e696204/gather/0.log"
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.078708 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sfq27/must-gather-ghk8b"]
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.079522 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-sfq27/must-gather-ghk8b" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerName="copy" containerID="cri-o://09fa0555e62df6e8c411b1580ab4181288469e95a923159ad6bb4dc613f6b9a9" gracePeriod=2
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.090307 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sfq27/must-gather-ghk8b"]
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.357589 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sfq27_must-gather-ghk8b_6baeaaaa-644d-4cd2-bff2-ff6c5e696204/copy/0.log"
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.358542 4749 generic.go:334] "Generic (PLEG): container finished" podID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerID="09fa0555e62df6e8c411b1580ab4181288469e95a923159ad6bb4dc613f6b9a9" exitCode=143
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.517532 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sfq27_must-gather-ghk8b_6baeaaaa-644d-4cd2-bff2-ff6c5e696204/copy/0.log"
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.517909 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/must-gather-ghk8b"
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.659139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvkdv\" (UniqueName: \"kubernetes.io/projected/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-kube-api-access-tvkdv\") pod \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\" (UID: \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") "
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.659351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-must-gather-output\") pod \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\" (UID: \"6baeaaaa-644d-4cd2-bff2-ff6c5e696204\") "
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.666293 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-kube-api-access-tvkdv" (OuterVolumeSpecName: "kube-api-access-tvkdv") pod "6baeaaaa-644d-4cd2-bff2-ff6c5e696204" (UID: "6baeaaaa-644d-4cd2-bff2-ff6c5e696204"). InnerVolumeSpecName "kube-api-access-tvkdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.762179 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvkdv\" (UniqueName: \"kubernetes.io/projected/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-kube-api-access-tvkdv\") on node \"crc\" DevicePath \"\""
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.862249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6baeaaaa-644d-4cd2-bff2-ff6c5e696204" (UID: "6baeaaaa-644d-4cd2-bff2-ff6c5e696204"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 20:06:37 crc kubenswrapper[4749]: I0219 20:06:37.863715 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6baeaaaa-644d-4cd2-bff2-ff6c5e696204-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 19 20:06:38 crc kubenswrapper[4749]: I0219 20:06:38.368268 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sfq27_must-gather-ghk8b_6baeaaaa-644d-4cd2-bff2-ff6c5e696204/copy/0.log"
Feb 19 20:06:38 crc kubenswrapper[4749]: I0219 20:06:38.368839 4749 scope.go:117] "RemoveContainer" containerID="09fa0555e62df6e8c411b1580ab4181288469e95a923159ad6bb4dc613f6b9a9"
Feb 19 20:06:38 crc kubenswrapper[4749]: I0219 20:06:38.368873 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sfq27/must-gather-ghk8b"
Feb 19 20:06:38 crc kubenswrapper[4749]: I0219 20:06:38.386520 4749 scope.go:117] "RemoveContainer" containerID="2eab541f23d459c8e4eafc1599a29ee6dab665bbfe1a0cc34cbdd34d4ffd60e5"
Feb 19 20:06:38 crc kubenswrapper[4749]: I0219 20:06:38.691188 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" path="/var/lib/kubelet/pods/6baeaaaa-644d-4cd2-bff2-ff6c5e696204/volumes"
Feb 19 20:07:26 crc kubenswrapper[4749]: I0219 20:07:26.780257 4749 scope.go:117] "RemoveContainer" containerID="01e4afbec47b99516ac12f1173a0d5a786ccf408a679c2e879582c2a68992a76"
Feb 19 20:08:54 crc kubenswrapper[4749]: I0219 20:08:54.725052 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:08:54 crc kubenswrapper[4749]: I0219 20:08:54.725672 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:09:24 crc kubenswrapper[4749]: I0219 20:09:24.725080 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:09:24 crc kubenswrapper[4749]: I0219 20:09:24.725779 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:09:54 crc kubenswrapper[4749]: I0219 20:09:54.725252 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:09:54 crc kubenswrapper[4749]: I0219 20:09:54.725829 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:09:54 crc kubenswrapper[4749]: I0219 20:09:54.725902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 20:09:54 crc kubenswrapper[4749]: I0219 20:09:54.726646 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0ed5f6d4b268527a299c8e916a7c157586f7cffd5eaed0132646118a1ff546c"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 20:09:54 crc kubenswrapper[4749]: I0219 20:09:54.726746 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://f0ed5f6d4b268527a299c8e916a7c157586f7cffd5eaed0132646118a1ff546c" gracePeriod=600
Feb 19 20:09:55 crc kubenswrapper[4749]: I0219 20:09:55.253756 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="f0ed5f6d4b268527a299c8e916a7c157586f7cffd5eaed0132646118a1ff546c" exitCode=0
Feb 19 20:09:55 crc kubenswrapper[4749]: I0219 20:09:55.254059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"f0ed5f6d4b268527a299c8e916a7c157586f7cffd5eaed0132646118a1ff546c"}
Feb 19 20:09:55 crc kubenswrapper[4749]: I0219 20:09:55.254098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"}
Feb 19 20:09:55 crc kubenswrapper[4749]: I0219 20:09:55.254115 4749 scope.go:117] "RemoveContainer" containerID="872c864f6d88380d8eaf74b12ba3710a1f61efcad865ef01a93bcd572710f0f5"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.445383 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qbkkx/must-gather-s5qs9"]
Feb 19 20:10:14 crc kubenswrapper[4749]: E0219 20:10:14.446461 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerName="gather"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.446480 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerName="gather"
Feb 19 20:10:14 crc kubenswrapper[4749]: E0219 20:10:14.446515 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerName="copy"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.446523 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerName="copy"
Feb 19 20:10:14 crc kubenswrapper[4749]: E0219 20:10:14.446540 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="extract-utilities"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.446548 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="extract-utilities"
Feb 19 20:10:14 crc kubenswrapper[4749]: E0219 20:10:14.446561 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="registry-server"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.446569 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="registry-server"
Feb 19 20:10:14 crc kubenswrapper[4749]: E0219 20:10:14.446585 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="extract-content"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.446592 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="extract-content"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.446954 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerName="copy"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.447036 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c6891a-c73f-4e30-a775-4bd50e6698f6" containerName="registry-server"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.447077 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6baeaaaa-644d-4cd2-bff2-ff6c5e696204" containerName="gather"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.448640 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.455677 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qbkkx"/"openshift-service-ca.crt"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.455731 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qbkkx"/"kube-root-ca.crt"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.455755 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qbkkx"/"default-dockercfg-jgx2t"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.465603 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qbkkx/must-gather-s5qs9"]
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.551811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bd4g\" (UniqueName: \"kubernetes.io/projected/8e1e8621-901e-44d8-a2ba-7b56b11d302b-kube-api-access-4bd4g\") pod \"must-gather-s5qs9\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") " pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.551923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e1e8621-901e-44d8-a2ba-7b56b11d302b-must-gather-output\") pod \"must-gather-s5qs9\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") " pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.654243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bd4g\" (UniqueName: \"kubernetes.io/projected/8e1e8621-901e-44d8-a2ba-7b56b11d302b-kube-api-access-4bd4g\") pod \"must-gather-s5qs9\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") " pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.654342 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e1e8621-901e-44d8-a2ba-7b56b11d302b-must-gather-output\") pod \"must-gather-s5qs9\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") " pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.654829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e1e8621-901e-44d8-a2ba-7b56b11d302b-must-gather-output\") pod \"must-gather-s5qs9\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") " pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.674199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bd4g\" (UniqueName: \"kubernetes.io/projected/8e1e8621-901e-44d8-a2ba-7b56b11d302b-kube-api-access-4bd4g\") pod \"must-gather-s5qs9\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") " pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:14 crc kubenswrapper[4749]: I0219 20:10:14.767337 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:10:15 crc kubenswrapper[4749]: I0219 20:10:15.244095 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qbkkx/must-gather-s5qs9"]
Feb 19 20:10:15 crc kubenswrapper[4749]: W0219 20:10:15.246674 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e1e8621_901e_44d8_a2ba_7b56b11d302b.slice/crio-0ac44b808cdd8c071fbad991581b6f7d82174ef191f9771a1c1c71ef69b38c23 WatchSource:0}: Error finding container 0ac44b808cdd8c071fbad991581b6f7d82174ef191f9771a1c1c71ef69b38c23: Status 404 returned error can't find the container with id 0ac44b808cdd8c071fbad991581b6f7d82174ef191f9771a1c1c71ef69b38c23
Feb 19 20:10:15 crc kubenswrapper[4749]: I0219 20:10:15.443762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/must-gather-s5qs9" event={"ID":"8e1e8621-901e-44d8-a2ba-7b56b11d302b","Type":"ContainerStarted","Data":"0ac44b808cdd8c071fbad991581b6f7d82174ef191f9771a1c1c71ef69b38c23"}
Feb 19 20:10:16 crc kubenswrapper[4749]: I0219 20:10:16.453971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/must-gather-s5qs9" event={"ID":"8e1e8621-901e-44d8-a2ba-7b56b11d302b","Type":"ContainerStarted","Data":"a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812"}
Feb 19 20:10:16 crc kubenswrapper[4749]: I0219 20:10:16.454349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/must-gather-s5qs9" event={"ID":"8e1e8621-901e-44d8-a2ba-7b56b11d302b","Type":"ContainerStarted","Data":"3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59"}
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.639284 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qbkkx/must-gather-s5qs9" podStartSLOduration=5.639264454 podStartE2EDuration="5.639264454s" podCreationTimestamp="2026-02-19 20:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:10:16.479941462 +0000 UTC m=+5790.441161416" watchObservedRunningTime="2026-02-19 20:10:19.639264454 +0000 UTC m=+5793.600484418"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.643524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-5l9x9"]
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.646459 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.767729 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs5fx\" (UniqueName: \"kubernetes.io/projected/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-kube-api-access-fs5fx\") pod \"crc-debug-5l9x9\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") " pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.768197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-host\") pod \"crc-debug-5l9x9\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") " pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.869672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs5fx\" (UniqueName: \"kubernetes.io/projected/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-kube-api-access-fs5fx\") pod \"crc-debug-5l9x9\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") " pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.869749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-host\") pod \"crc-debug-5l9x9\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") " pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.869943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-host\") pod \"crc-debug-5l9x9\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") " pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.889740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs5fx\" (UniqueName: \"kubernetes.io/projected/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-kube-api-access-fs5fx\") pod \"crc-debug-5l9x9\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") " pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: I0219 20:10:19.969726 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:10:19 crc kubenswrapper[4749]: W0219 20:10:19.996270 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff79f60e_94c5_4c5c_822b_60a8c0c9b8c1.slice/crio-1e1f99f27bab284310828149e8d6be6816f381605dbf513bdc1290c5378f70cf WatchSource:0}: Error finding container 1e1f99f27bab284310828149e8d6be6816f381605dbf513bdc1290c5378f70cf: Status 404 returned error can't find the container with id 1e1f99f27bab284310828149e8d6be6816f381605dbf513bdc1290c5378f70cf
Feb 19 20:10:20 crc kubenswrapper[4749]: I0219 20:10:20.491438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9" event={"ID":"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1","Type":"ContainerStarted","Data":"606d946adb7abd8be1c12fc4c9338149adc81017f1cb0ad459a0d34e26bed187"}
Feb 19 20:10:20 crc kubenswrapper[4749]: I0219 20:10:20.491928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9" event={"ID":"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1","Type":"ContainerStarted","Data":"1e1f99f27bab284310828149e8d6be6816f381605dbf513bdc1290c5378f70cf"}
Feb 19 20:10:20 crc kubenswrapper[4749]: I0219 20:10:20.513951 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9" podStartSLOduration=1.513923398 podStartE2EDuration="1.513923398s" podCreationTimestamp="2026-02-19 20:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:10:20.504404818 +0000 UTC m=+5794.465624792" watchObservedRunningTime="2026-02-19 20:10:20.513923398 +0000 UTC m=+5794.475143362"
Feb 19 20:11:00 crc kubenswrapper[4749]: I0219 20:11:00.909230 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1" containerID="606d946adb7abd8be1c12fc4c9338149adc81017f1cb0ad459a0d34e26bed187" exitCode=0
Feb 19 20:11:00 crc kubenswrapper[4749]: I0219 20:11:00.909294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9" event={"ID":"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1","Type":"ContainerDied","Data":"606d946adb7abd8be1c12fc4c9338149adc81017f1cb0ad459a0d34e26bed187"}
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.050440 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.066307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs5fx\" (UniqueName: \"kubernetes.io/projected/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-kube-api-access-fs5fx\") pod \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") "
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.066350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-host\") pod \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\" (UID: \"ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1\") "
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.066549 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-host" (OuterVolumeSpecName: "host") pod "ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1" (UID: "ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.067125 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-host\") on node \"crc\" DevicePath \"\""
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.073376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-kube-api-access-fs5fx" (OuterVolumeSpecName: "kube-api-access-fs5fx") pod "ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1" (UID: "ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1"). InnerVolumeSpecName "kube-api-access-fs5fx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.098535 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-5l9x9"]
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.108790 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-5l9x9"]
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.169984 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs5fx\" (UniqueName: \"kubernetes.io/projected/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1-kube-api-access-fs5fx\") on node \"crc\" DevicePath \"\""
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.690087 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1" path="/var/lib/kubelet/pods/ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1/volumes"
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.928745 4749 scope.go:117] "RemoveContainer" containerID="606d946adb7abd8be1c12fc4c9338149adc81017f1cb0ad459a0d34e26bed187"
Feb 19 20:11:02 crc kubenswrapper[4749]: I0219 20:11:02.928821 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-5l9x9"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.343797 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-rb979"]
Feb 19 20:11:03 crc kubenswrapper[4749]: E0219 20:11:03.344240 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1" containerName="container-00"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.344252 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1" containerName="container-00"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.344528 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff79f60e-94c5-4c5c-822b-60a8c0c9b8c1" containerName="container-00"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.345183 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.403375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6thng\" (UniqueName: \"kubernetes.io/projected/41f56c15-e8f6-45aa-8a4a-e45022cc616b-kube-api-access-6thng\") pod \"crc-debug-rb979\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") " pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.403480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41f56c15-e8f6-45aa-8a4a-e45022cc616b-host\") pod \"crc-debug-rb979\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") " pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.505797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6thng\" (UniqueName: \"kubernetes.io/projected/41f56c15-e8f6-45aa-8a4a-e45022cc616b-kube-api-access-6thng\") pod \"crc-debug-rb979\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") " pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.505935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41f56c15-e8f6-45aa-8a4a-e45022cc616b-host\") pod \"crc-debug-rb979\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") " pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.506174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41f56c15-e8f6-45aa-8a4a-e45022cc616b-host\") pod \"crc-debug-rb979\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") " pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.539239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6thng\" (UniqueName: \"kubernetes.io/projected/41f56c15-e8f6-45aa-8a4a-e45022cc616b-kube-api-access-6thng\") pod \"crc-debug-rb979\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") " pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.663363 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:03 crc kubenswrapper[4749]: I0219 20:11:03.941857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-rb979" event={"ID":"41f56c15-e8f6-45aa-8a4a-e45022cc616b","Type":"ContainerStarted","Data":"4c7384631682fbb33ff7da4899d106fed791d4dd6d8d04b487667f066a9c2bf1"}
Feb 19 20:11:04 crc kubenswrapper[4749]: I0219 20:11:04.954043 4749 generic.go:334] "Generic (PLEG): container finished" podID="41f56c15-e8f6-45aa-8a4a-e45022cc616b" containerID="df8298cb06d59b2583597f23c71fc10924009b208f4624f81a268634f1cc3fa0" exitCode=0
Feb 19 20:11:04 crc kubenswrapper[4749]: I0219 20:11:04.954147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-rb979" event={"ID":"41f56c15-e8f6-45aa-8a4a-e45022cc616b","Type":"ContainerDied","Data":"df8298cb06d59b2583597f23c71fc10924009b208f4624f81a268634f1cc3fa0"}
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.596356 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jp7w4"]
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.601114 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.615511 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp7w4"]
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.647970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-catalog-content\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.648268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9rb\" (UniqueName: \"kubernetes.io/projected/b999dc79-b1cf-4ab3-99c2-19092badce3c-kube-api-access-nh9rb\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.648378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-utilities\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.749840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9rb\" (UniqueName: \"kubernetes.io/projected/b999dc79-b1cf-4ab3-99c2-19092badce3c-kube-api-access-nh9rb\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.750193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-utilities\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.750234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-catalog-content\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.751699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-utilities\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.753339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-catalog-content\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.779008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9rb\" (UniqueName: \"kubernetes.io/projected/b999dc79-b1cf-4ab3-99c2-19092badce3c-kube-api-access-nh9rb\") pod \"redhat-marketplace-jp7w4\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:05 crc kubenswrapper[4749]: I0219 20:11:05.921156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp7w4"
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.107387 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-rb979"
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.163753 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41f56c15-e8f6-45aa-8a4a-e45022cc616b-host\") pod \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") "
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.163824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6thng\" (UniqueName: \"kubernetes.io/projected/41f56c15-e8f6-45aa-8a4a-e45022cc616b-kube-api-access-6thng\") pod \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\" (UID: \"41f56c15-e8f6-45aa-8a4a-e45022cc616b\") "
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.164182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41f56c15-e8f6-45aa-8a4a-e45022cc616b-host" (OuterVolumeSpecName: "host") pod "41f56c15-e8f6-45aa-8a4a-e45022cc616b" (UID: "41f56c15-e8f6-45aa-8a4a-e45022cc616b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.164342 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41f56c15-e8f6-45aa-8a4a-e45022cc616b-host\") on node \"crc\" DevicePath \"\""
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.169251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f56c15-e8f6-45aa-8a4a-e45022cc616b-kube-api-access-6thng" (OuterVolumeSpecName: "kube-api-access-6thng") pod "41f56c15-e8f6-45aa-8a4a-e45022cc616b" (UID: "41f56c15-e8f6-45aa-8a4a-e45022cc616b"). InnerVolumeSpecName "kube-api-access-6thng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.269597 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6thng\" (UniqueName: \"kubernetes.io/projected/41f56c15-e8f6-45aa-8a4a-e45022cc616b-kube-api-access-6thng\") on node \"crc\" DevicePath \"\""
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.464782 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp7w4"]
Feb 19 20:11:06 crc kubenswrapper[4749]: W0219 20:11:06.465736 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb999dc79_b1cf_4ab3_99c2_19092badce3c.slice/crio-f361703a662330a394120b0e07eef63bbe12db08d426e143a60f873f24ac02a2 WatchSource:0}: Error finding container f361703a662330a394120b0e07eef63bbe12db08d426e143a60f873f24ac02a2: Status 404 returned error can't find the container with id f361703a662330a394120b0e07eef63bbe12db08d426e143a60f873f24ac02a2
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.975349 4749 generic.go:334] "Generic (PLEG): container finished" podID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerID="395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8" exitCode=0
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.975419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp7w4" event={"ID":"b999dc79-b1cf-4ab3-99c2-19092badce3c","Type":"ContainerDied","Data":"395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8"}
Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.975768 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp7w4" event={"ID":"b999dc79-b1cf-4ab3-99c2-19092badce3c","Type":"ContainerStarted","Data":"f361703a662330a394120b0e07eef63bbe12db08d426e143a60f873f24ac02a2"}
Feb 19
20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.977886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-rb979" event={"ID":"41f56c15-e8f6-45aa-8a4a-e45022cc616b","Type":"ContainerDied","Data":"4c7384631682fbb33ff7da4899d106fed791d4dd6d8d04b487667f066a9c2bf1"} Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.977941 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c7384631682fbb33ff7da4899d106fed791d4dd6d8d04b487667f066a9c2bf1" Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.977995 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-rb979" Feb 19 20:11:06 crc kubenswrapper[4749]: I0219 20:11:06.978091 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:11:07 crc kubenswrapper[4749]: I0219 20:11:07.248513 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-rb979"] Feb 19 20:11:07 crc kubenswrapper[4749]: I0219 20:11:07.259398 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-rb979"] Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.443887 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-ccvrp"] Feb 19 20:11:08 crc kubenswrapper[4749]: E0219 20:11:08.444322 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f56c15-e8f6-45aa-8a4a-e45022cc616b" containerName="container-00" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.444335 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f56c15-e8f6-45aa-8a4a-e45022cc616b" containerName="container-00" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.462377 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f56c15-e8f6-45aa-8a4a-e45022cc616b" containerName="container-00" Feb 19 20:11:08 
crc kubenswrapper[4749]: I0219 20:11:08.463692 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.526815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsbmh\" (UniqueName: \"kubernetes.io/projected/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-kube-api-access-hsbmh\") pod \"crc-debug-ccvrp\" (UID: \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.527401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-host\") pod \"crc-debug-ccvrp\" (UID: \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.628934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsbmh\" (UniqueName: \"kubernetes.io/projected/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-kube-api-access-hsbmh\") pod \"crc-debug-ccvrp\" (UID: \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.629110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-host\") pod \"crc-debug-ccvrp\" (UID: \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.629257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-host\") pod \"crc-debug-ccvrp\" (UID: 
\"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.650560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsbmh\" (UniqueName: \"kubernetes.io/projected/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-kube-api-access-hsbmh\") pod \"crc-debug-ccvrp\" (UID: \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.689281 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f56c15-e8f6-45aa-8a4a-e45022cc616b" path="/var/lib/kubelet/pods/41f56c15-e8f6-45aa-8a4a-e45022cc616b/volumes" Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.791543 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:08 crc kubenswrapper[4749]: W0219 20:11:08.833441 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c58b35d_f43e_4fd0_9e6f_3695d4e4c24a.slice/crio-7026eb41c583ad041eac9a3c29c4cf7199b425a1693fe340fb395e381112bb05 WatchSource:0}: Error finding container 7026eb41c583ad041eac9a3c29c4cf7199b425a1693fe340fb395e381112bb05: Status 404 returned error can't find the container with id 7026eb41c583ad041eac9a3c29c4cf7199b425a1693fe340fb395e381112bb05 Feb 19 20:11:08 crc kubenswrapper[4749]: I0219 20:11:08.996077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" event={"ID":"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a","Type":"ContainerStarted","Data":"7026eb41c583ad041eac9a3c29c4cf7199b425a1693fe340fb395e381112bb05"} Feb 19 20:11:10 crc kubenswrapper[4749]: I0219 20:11:10.007684 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp7w4" 
event={"ID":"b999dc79-b1cf-4ab3-99c2-19092badce3c","Type":"ContainerStarted","Data":"e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b"} Feb 19 20:11:10 crc kubenswrapper[4749]: I0219 20:11:10.010070 4749 generic.go:334] "Generic (PLEG): container finished" podID="0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a" containerID="3832534e1ed0601217cffba795d39e3cc7fa689316fb7c282d0cddf71a96ebeb" exitCode=0 Feb 19 20:11:10 crc kubenswrapper[4749]: I0219 20:11:10.010131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" event={"ID":"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a","Type":"ContainerDied","Data":"3832534e1ed0601217cffba795d39e3cc7fa689316fb7c282d0cddf71a96ebeb"} Feb 19 20:11:10 crc kubenswrapper[4749]: I0219 20:11:10.070749 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-ccvrp"] Feb 19 20:11:10 crc kubenswrapper[4749]: I0219 20:11:10.081063 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qbkkx/crc-debug-ccvrp"] Feb 19 20:11:11 crc kubenswrapper[4749]: I0219 20:11:11.135715 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:11 crc kubenswrapper[4749]: I0219 20:11:11.282198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-host\") pod \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\" (UID: \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " Feb 19 20:11:11 crc kubenswrapper[4749]: I0219 20:11:11.282301 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-host" (OuterVolumeSpecName: "host") pod "0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a" (UID: "0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:11:11 crc kubenswrapper[4749]: I0219 20:11:11.282847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsbmh\" (UniqueName: \"kubernetes.io/projected/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-kube-api-access-hsbmh\") pod \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\" (UID: \"0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a\") " Feb 19 20:11:11 crc kubenswrapper[4749]: I0219 20:11:11.283498 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:11 crc kubenswrapper[4749]: I0219 20:11:11.292519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-kube-api-access-hsbmh" (OuterVolumeSpecName: "kube-api-access-hsbmh") pod "0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a" (UID: "0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a"). InnerVolumeSpecName "kube-api-access-hsbmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:11:11 crc kubenswrapper[4749]: I0219 20:11:11.389063 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsbmh\" (UniqueName: \"kubernetes.io/projected/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a-kube-api-access-hsbmh\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:12 crc kubenswrapper[4749]: I0219 20:11:12.033217 4749 scope.go:117] "RemoveContainer" containerID="3832534e1ed0601217cffba795d39e3cc7fa689316fb7c282d0cddf71a96ebeb" Feb 19 20:11:12 crc kubenswrapper[4749]: I0219 20:11:12.033380 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qbkkx/crc-debug-ccvrp" Feb 19 20:11:12 crc kubenswrapper[4749]: I0219 20:11:12.688642 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a" path="/var/lib/kubelet/pods/0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a/volumes" Feb 19 20:11:13 crc kubenswrapper[4749]: I0219 20:11:13.043861 4749 generic.go:334] "Generic (PLEG): container finished" podID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerID="e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b" exitCode=0 Feb 19 20:11:13 crc kubenswrapper[4749]: I0219 20:11:13.043936 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp7w4" event={"ID":"b999dc79-b1cf-4ab3-99c2-19092badce3c","Type":"ContainerDied","Data":"e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b"} Feb 19 20:11:15 crc kubenswrapper[4749]: I0219 20:11:15.064335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp7w4" event={"ID":"b999dc79-b1cf-4ab3-99c2-19092badce3c","Type":"ContainerStarted","Data":"5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721"} Feb 19 20:11:15 crc kubenswrapper[4749]: I0219 20:11:15.094602 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jp7w4" podStartSLOduration=2.848280456 podStartE2EDuration="10.094579309s" podCreationTimestamp="2026-02-19 20:11:05 +0000 UTC" firstStartedPulling="2026-02-19 20:11:06.977737882 +0000 UTC m=+5840.938957836" lastFinishedPulling="2026-02-19 20:11:14.224036725 +0000 UTC m=+5848.185256689" observedRunningTime="2026-02-19 20:11:15.086177306 +0000 UTC m=+5849.047397260" watchObservedRunningTime="2026-02-19 20:11:15.094579309 +0000 UTC m=+5849.055799273" Feb 19 20:11:15 crc kubenswrapper[4749]: I0219 20:11:15.922065 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-jp7w4" Feb 19 20:11:15 crc kubenswrapper[4749]: I0219 20:11:15.922429 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jp7w4" Feb 19 20:11:16 crc kubenswrapper[4749]: I0219 20:11:16.982201 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-jp7w4" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="registry-server" probeResult="failure" output=< Feb 19 20:11:16 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 20:11:16 crc kubenswrapper[4749]: > Feb 19 20:11:25 crc kubenswrapper[4749]: I0219 20:11:25.972978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jp7w4" Feb 19 20:11:26 crc kubenswrapper[4749]: I0219 20:11:26.037006 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jp7w4" Feb 19 20:11:26 crc kubenswrapper[4749]: I0219 20:11:26.213246 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp7w4"] Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.169060 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jp7w4" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="registry-server" containerID="cri-o://5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721" gracePeriod=2 Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.670479 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp7w4" Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.815261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-utilities\") pod \"b999dc79-b1cf-4ab3-99c2-19092badce3c\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.815816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9rb\" (UniqueName: \"kubernetes.io/projected/b999dc79-b1cf-4ab3-99c2-19092badce3c-kube-api-access-nh9rb\") pod \"b999dc79-b1cf-4ab3-99c2-19092badce3c\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.815951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-catalog-content\") pod \"b999dc79-b1cf-4ab3-99c2-19092badce3c\" (UID: \"b999dc79-b1cf-4ab3-99c2-19092badce3c\") " Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.816368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-utilities" (OuterVolumeSpecName: "utilities") pod "b999dc79-b1cf-4ab3-99c2-19092badce3c" (UID: "b999dc79-b1cf-4ab3-99c2-19092badce3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.822051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b999dc79-b1cf-4ab3-99c2-19092badce3c-kube-api-access-nh9rb" (OuterVolumeSpecName: "kube-api-access-nh9rb") pod "b999dc79-b1cf-4ab3-99c2-19092badce3c" (UID: "b999dc79-b1cf-4ab3-99c2-19092badce3c"). InnerVolumeSpecName "kube-api-access-nh9rb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.838847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b999dc79-b1cf-4ab3-99c2-19092badce3c" (UID: "b999dc79-b1cf-4ab3-99c2-19092badce3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.917413 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.917448 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9rb\" (UniqueName: \"kubernetes.io/projected/b999dc79-b1cf-4ab3-99c2-19092badce3c-kube-api-access-nh9rb\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:27 crc kubenswrapper[4749]: I0219 20:11:27.917460 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b999dc79-b1cf-4ab3-99c2-19092badce3c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.182743 4749 generic.go:334] "Generic (PLEG): container finished" podID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerID="5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721" exitCode=0 Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.182779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp7w4" event={"ID":"b999dc79-b1cf-4ab3-99c2-19092badce3c","Type":"ContainerDied","Data":"5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721"} Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.182824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jp7w4" event={"ID":"b999dc79-b1cf-4ab3-99c2-19092badce3c","Type":"ContainerDied","Data":"f361703a662330a394120b0e07eef63bbe12db08d426e143a60f873f24ac02a2"} Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.182843 4749 scope.go:117] "RemoveContainer" containerID="5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.182842 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp7w4" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.218208 4749 scope.go:117] "RemoveContainer" containerID="e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.222590 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp7w4"] Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.233813 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp7w4"] Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.239738 4749 scope.go:117] "RemoveContainer" containerID="395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.287188 4749 scope.go:117] "RemoveContainer" containerID="5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721" Feb 19 20:11:28 crc kubenswrapper[4749]: E0219 20:11:28.287663 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721\": container with ID starting with 5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721 not found: ID does not exist" containerID="5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.287691 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721"} err="failed to get container status \"5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721\": rpc error: code = NotFound desc = could not find container \"5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721\": container with ID starting with 5cf19d9594e5e850ccdcfeb729068b2396605dc7b15f56e8db23af2cb7e63721 not found: ID does not exist" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.287716 4749 scope.go:117] "RemoveContainer" containerID="e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b" Feb 19 20:11:28 crc kubenswrapper[4749]: E0219 20:11:28.288110 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b\": container with ID starting with e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b not found: ID does not exist" containerID="e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.288156 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b"} err="failed to get container status \"e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b\": rpc error: code = NotFound desc = could not find container \"e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b\": container with ID starting with e421b171fa1814d7b6a97599957d502d875fa029074466109a6b390024160f1b not found: ID does not exist" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.288184 4749 scope.go:117] "RemoveContainer" containerID="395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8" Feb 19 20:11:28 crc kubenswrapper[4749]: E0219 
20:11:28.288502 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8\": container with ID starting with 395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8 not found: ID does not exist" containerID="395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.288525 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8"} err="failed to get container status \"395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8\": rpc error: code = NotFound desc = could not find container \"395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8\": container with ID starting with 395b38bb392c88642145a3a714bcd5dfee6f250655882e024b3226d23a882da8 not found: ID does not exist" Feb 19 20:11:28 crc kubenswrapper[4749]: I0219 20:11:28.689827 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" path="/var/lib/kubelet/pods/b999dc79-b1cf-4ab3-99c2-19092badce3c/volumes" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.062069 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d97b8448-pl5l5_f19ae41b-3804-434e-b4a6-d461167f9548/barbican-api/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.272416 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d44554668-c49q8_2ea2def9-7751-439c-8c18-05f3568cae9f/barbican-keystone-listener/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.275867 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d97b8448-pl5l5_f19ae41b-3804-434e-b4a6-d461167f9548/barbican-api-log/0.log" Feb 19 20:11:54 crc 
kubenswrapper[4749]: I0219 20:11:54.359382 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d44554668-c49q8_2ea2def9-7751-439c-8c18-05f3568cae9f/barbican-keystone-listener-log/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.486777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b845bddc9-bzwtz_16aa5e20-01b7-401e-abfd-161e81af9c70/barbican-worker/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.549194 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b845bddc9-bzwtz_16aa5e20-01b7-401e-abfd-161e81af9c70/barbican-worker-log/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.714942 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jz7jq_22b6e4d8-5ab3-4a92-b8a4-e38b68e59744/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.774785 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/ceilometer-central-agent/1.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.879960 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/ceilometer-central-agent/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.931220 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/ceilometer-notification-agent/0.log" Feb 19 20:11:54 crc kubenswrapper[4749]: I0219 20:11:54.987983 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/sg-core/0.log" Feb 19 20:11:55 crc kubenswrapper[4749]: I0219 20:11:55.004248 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_e481767e-68e7-4396-b8aa-51956e378132/proxy-httpd/0.log"
Feb 19 20:11:55 crc kubenswrapper[4749]: I0219 20:11:55.224941 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fa8559a3-5137-4d82-a189-18e060db5fa5/cinder-api-log/0.log"
Feb 19 20:11:55 crc kubenswrapper[4749]: I0219 20:11:55.489254 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_94d055c5-b069-494b-a250-27a5c39826c2/probe/0.log"
Feb 19 20:11:55 crc kubenswrapper[4749]: I0219 20:11:55.681361 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_94d055c5-b069-494b-a250-27a5c39826c2/cinder-backup/0.log"
Feb 19 20:11:55 crc kubenswrapper[4749]: I0219 20:11:55.771295 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f19d3222-dbed-44bf-94e0-7a17f5906051/cinder-scheduler/0.log"
Feb 19 20:11:55 crc kubenswrapper[4749]: I0219 20:11:55.858185 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f19d3222-dbed-44bf-94e0-7a17f5906051/probe/0.log"
Feb 19 20:11:55 crc kubenswrapper[4749]: I0219 20:11:55.980787 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fa8559a3-5137-4d82-a189-18e060db5fa5/cinder-api/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.122164 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_afe70cf9-b491-4c93-8c94-ae052eb02db4/probe/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.180075 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_afe70cf9-b491-4c93-8c94-ae052eb02db4/cinder-volume/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.352274 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f2a5ecf9-3280-4da3-9ea6-6491401e4daa/probe/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.427428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f2a5ecf9-3280-4da3-9ea6-6491401e4daa/cinder-volume/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.455976 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-k9k2z_3c39d9c2-1081-4cf9-96c7-746c51a42207/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.631628 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qw75z_780425c2-7f97-4d00-a992-bf0ef5be3876/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.736670 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c96bd5bf7-wmjrv_7707fe3e-adab-4755-bf50-f74bb3924913/init/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.873756 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c96bd5bf7-wmjrv_7707fe3e-adab-4755-bf50-f74bb3924913/init/0.log"
Feb 19 20:11:56 crc kubenswrapper[4749]: I0219 20:11:56.969092 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5lc2k_fe29f0cb-bc56-4d7b-983c-52667a6c4ceb/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.028355 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c96bd5bf7-wmjrv_7707fe3e-adab-4755-bf50-f74bb3924913/dnsmasq-dns/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.173353 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b40edc19-78bc-456e-ad9f-c3dcae644950/glance-log/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.190013 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b40edc19-78bc-456e-ad9f-c3dcae644950/glance-httpd/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.386357 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4d9d8ca6-8ca8-415e-9120-5c48d275052c/glance-log/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.389444 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4d9d8ca6-8ca8-415e-9120-5c48d275052c/glance-httpd/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.552753 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b4d589db8-c89ft_8812ac95-8284-4b4f-a838-b5ab30a55fad/horizon/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.695236 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qhgm2_486b7134-04b2-4255-b831-c7da1c6fdcfe/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:11:57 crc kubenswrapper[4749]: I0219 20:11:57.921592 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hkxkr_59c9b6e2-493f-4c1a-ae9b-47dca8a2658d/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:11:58 crc kubenswrapper[4749]: I0219 20:11:58.149997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525461-vkh7z_bdfb9341-a712-489d-ba8e-01ce41b5d1cb/keystone-cron/0.log"
Feb 19 20:11:58 crc kubenswrapper[4749]: I0219 20:11:58.301185 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525521-n6rn5_9759f20d-bfd3-4538-b451-418ffaa00853/keystone-cron/0.log"
Feb 19 20:11:58 crc kubenswrapper[4749]: I0219 20:11:58.436311 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e6855d59-78cd-4386-b41b-8670ebdadafb/kube-state-metrics/0.log"
Feb 19 20:11:58 crc kubenswrapper[4749]: I0219 20:11:58.536011 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b4d589db8-c89ft_8812ac95-8284-4b4f-a838-b5ab30a55fad/horizon-log/0.log"
Feb 19 20:11:58 crc kubenswrapper[4749]: I0219 20:11:58.609851 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-59976cccdd-n7pz6_559ba668-4259-4ec1-a8b7-e6ab6b78d4b6/keystone-api/0.log"
Feb 19 20:11:58 crc kubenswrapper[4749]: I0219 20:11:58.663965 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xjsf4_1032ad4c-247e-48e1-805c-31aadc54415d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:11:59 crc kubenswrapper[4749]: I0219 20:11:59.021786 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-h9x27_cdcbdafe-8bad-41be-91bb-59bc54994227/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:11:59 crc kubenswrapper[4749]: I0219 20:11:59.075608 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85c7d94649-hz2gq_3bc4f02b-0135-46b5-ad46-aa2a9ce82f54/neutron-httpd/0.log"
Feb 19 20:11:59 crc kubenswrapper[4749]: I0219 20:11:59.184503 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85c7d94649-hz2gq_3bc4f02b-0135-46b5-ad46-aa2a9ce82f54/neutron-api/0.log"
Feb 19 20:11:59 crc kubenswrapper[4749]: I0219 20:11:59.270517 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_27dfe8e9-686d-4703-b36d-df6b94491b40/setup-container/0.log"
Feb 19 20:11:59 crc kubenswrapper[4749]: I0219 20:11:59.435372 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_27dfe8e9-686d-4703-b36d-df6b94491b40/setup-container/0.log"
Feb 19 20:11:59 crc kubenswrapper[4749]: I0219 20:11:59.518310 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_27dfe8e9-686d-4703-b36d-df6b94491b40/rabbitmq/0.log"
Feb 19 20:12:00 crc kubenswrapper[4749]: I0219 20:12:00.129880 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4ccfb5c8-e819-4a4c-bf02-d7dd004d970e/nova-cell0-conductor-conductor/0.log"
Feb 19 20:12:00 crc kubenswrapper[4749]: I0219 20:12:00.412155 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0780a73c-852b-470b-9de7-61afde153d72/nova-cell1-conductor-conductor/0.log"
Feb 19 20:12:00 crc kubenswrapper[4749]: I0219 20:12:00.825921 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a05f7aea-8655-4484-ad07-c9c6f0e98880/nova-cell1-novncproxy-novncproxy/0.log"
Feb 19 20:12:00 crc kubenswrapper[4749]: I0219 20:12:00.979207 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-zr2gj_e120e358-960c-435a-9655-35499a01c0c0/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:01 crc kubenswrapper[4749]: I0219 20:12:01.326340 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3380219c-08a7-4ecd-8646-6e39cb13137b/nova-metadata-log/0.log"
Feb 19 20:12:01 crc kubenswrapper[4749]: I0219 20:12:01.424931 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26c708f4-7611-4c40-9dc9-8b941ff97b87/nova-api-log/0.log"
Feb 19 20:12:01 crc kubenswrapper[4749]: I0219 20:12:01.924822 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_26c708f4-7611-4c40-9dc9-8b941ff97b87/nova-api-api/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.016340 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e84776ec-57db-4685-84f6-f86655d9f079/mysql-bootstrap/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.056267 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_cfdcc5f4-f33b-4d0a-bab9-0c5ac64c9704/nova-scheduler-scheduler/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.156358 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e84776ec-57db-4685-84f6-f86655d9f079/mysql-bootstrap/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.248154 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e84776ec-57db-4685-84f6-f86655d9f079/galera/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.381005 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affb1316-cbf5-4641-bdd7-186e390b9e7e/mysql-bootstrap/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.569771 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affb1316-cbf5-4641-bdd7-186e390b9e7e/galera/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.576878 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affb1316-cbf5-4641-bdd7-186e390b9e7e/mysql-bootstrap/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.754182 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d676f6e-9d56-41ab-9689-a19a0b9665f7/openstackclient/0.log"
Feb 19 20:12:02 crc kubenswrapper[4749]: I0219 20:12:02.894200 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-sbvrv_59199734-1adc-46b0-9208-75331e4b868c/openstack-network-exporter/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.077637 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovsdb-server-init/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.274961 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovsdb-server-init/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.311941 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovsdb-server/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.531487 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ztkm4_7232b466-ffe3-4eab-ad4c-bb2ccac65929/ovn-controller/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.654346 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sn6j7_091cdb0e-a88c-4731-9a77-6a3c41e0fc1a/ovs-vswitchd/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.781562 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3380219c-08a7-4ecd-8646-6e39cb13137b/nova-metadata-metadata/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.786884 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-qtct9_8c86e6e6-0776-48c9-9c58-e1b2d41a4552/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.916363 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2231660b-7776-4cf8-a793-7d592dd23ecf/openstack-network-exporter/0.log"
Feb 19 20:12:03 crc kubenswrapper[4749]: I0219 20:12:03.953993 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2231660b-7776-4cf8-a793-7d592dd23ecf/ovn-northd/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.156872 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c7f450e-9c65-4f47-a259-c6e667660b59/openstack-network-exporter/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.168987 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0c7f450e-9c65-4f47-a259-c6e667660b59/ovsdbserver-nb/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.232605 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e2f6471c-3fea-45fc-8702-9022ff831352/openstack-network-exporter/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.340042 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e2f6471c-3fea-45fc-8702-9022ff831352/ovsdbserver-sb/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.626558 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/init-config-reloader/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.652694 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-548647668b-bwckt_d7dd258c-a64d-49cc-acf0-5bf79f10e8a5/placement-api/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.716469 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-548647668b-bwckt_d7dd258c-a64d-49cc-acf0-5bf79f10e8a5/placement-log/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.772870 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/init-config-reloader/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.869170 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/config-reloader/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.900218 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/prometheus/0.log"
Feb 19 20:12:04 crc kubenswrapper[4749]: I0219 20:12:04.907215 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f79a6e1e-9ad3-43c8-a02e-e0a9ecb59d6f/thanos-sidecar/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.131412 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6783e255-9125-478b-8c87-61176c735e2c/setup-container/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.477291 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6783e255-9125-478b-8c87-61176c735e2c/rabbitmq/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.480489 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6783e255-9125-478b-8c87-61176c735e2c/setup-container/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.541994 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f675112-5cb9-4988-b346-b29f1e2699f9/setup-container/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.757880 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f675112-5cb9-4988-b346-b29f1e2699f9/rabbitmq/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.803346 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hzmtd_52a3df8c-e606-4fe8-990f-cef2a807956d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.804016 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f675112-5cb9-4988-b346-b29f1e2699f9/setup-container/0.log"
Feb 19 20:12:05 crc kubenswrapper[4749]: I0219 20:12:05.996290 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-5cmm6_ab3e2ecf-4d0f-4482-87fc-1098e7b8818a/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.079167 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9vw42_bf17385f-8d5b-43a5-82c8-9d8bd893e056/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.238896 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bqhvv_b354c1a0-43cd-442f-b818-54fc0bc89cad/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.308959 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dz7fw_31a27783-6092-4112-97a0-2335f4f251b4/ssh-known-hosts-edpm-deployment/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.553057 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6966bc7795-zbh89_3ebc6a8f-ca72-408a-8add-2a21e7a4c803/proxy-server/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.648624 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-czb4z_f1234ce5-5e40-4f76-a3b5-8b47853bf147/swift-ring-rebalance/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.686601 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6966bc7795-zbh89_3ebc6a8f-ca72-408a-8add-2a21e7a4c803/proxy-httpd/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.863004 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-auditor/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.868634 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-reaper/0.log"
Feb 19 20:12:06 crc kubenswrapper[4749]: I0219 20:12:06.952242 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-replicator/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.039932 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/account-server/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.071380 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-auditor/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.079531 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-replicator/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.170592 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-server/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.228320 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/container-updater/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.288777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-auditor/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.350802 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-expirer/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.418266 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-replicator/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.426444 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-server/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.518189 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/object-updater/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.532622 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/rsync/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.626759 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ece11938-c758-4d62-ad84-c630d040f511/swift-recon-cron/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.823157 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-tsflr_39f74bf8-e240-408d-a674-c61fcf66fd06/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.900367 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_656c9f00-c5aa-4d25-b425-84c0ce173433/tempest-tests-tempest-tests-runner/0.log"
Feb 19 20:12:07 crc kubenswrapper[4749]: I0219 20:12:07.986264 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_00d5b3f4-f6de-4204-a2a8-633a9d9041e3/test-operator-logs-container/0.log"
Feb 19 20:12:08 crc kubenswrapper[4749]: I0219 20:12:08.148070 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-s8ddf_91c1231f-dc2c-4c68-ba4f-e6d99913bd60/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:12:08 crc kubenswrapper[4749]: I0219 20:12:08.922007 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_f97a54ef-7ca6-4ad1-951b-5c05572b591a/watcher-applier/0.log"
Feb 19 20:12:09 crc kubenswrapper[4749]: I0219 20:12:09.565984 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6/watcher-api-log/0.log"
Feb 19 20:12:12 crc kubenswrapper[4749]: I0219 20:12:12.247510 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_5f18804d-f75a-4e9c-ba11-ba225b074df7/watcher-decision-engine/0.log"
Feb 19 20:12:13 crc kubenswrapper[4749]: I0219 20:12:13.197885 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7a890ae9-2fb7-4410-b2a5-3374f5555b0f/memcached/0.log"
Feb 19 20:12:13 crc kubenswrapper[4749]: I0219 20:12:13.384940 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_6fd7ff0b-8aa5-4fec-a0c6-9f9c46eae5c6/watcher-api/0.log"
Feb 19 20:12:24 crc kubenswrapper[4749]: I0219 20:12:24.725597 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:12:24 crc kubenswrapper[4749]: I0219 20:12:24.726156 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.078255 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/util/0.log"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.285274 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/pull/0.log"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.292383 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/pull/0.log"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.292863 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/util/0.log"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.467112 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/util/0.log"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.478414 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/pull/0.log"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.490772 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fed82a6f8c5299af18261da229a874bc9bec308bae2ff85f973371043mz548_a22a9813-be4b-4990-b503-3a1c2c30ea1d/extract/0.log"
Feb 19 20:12:33 crc kubenswrapper[4749]: I0219 20:12:33.864520 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-5k6kx_82c521c0-6968-4298-afc8-e4aac617b61d/manager/0.log"
Feb 19 20:12:34 crc kubenswrapper[4749]: I0219 20:12:34.163260 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-kqx52_872c81d0-4024-4678-a081-6698ee2fe586/manager/0.log"
Feb 19 20:12:34 crc kubenswrapper[4749]: I0219 20:12:34.352199 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-xmlcc_b5e9baf7-cc67-4cde-ba8f-256eb3c5601f/manager/0.log"
Feb 19 20:12:34 crc kubenswrapper[4749]: I0219 20:12:34.580931 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-g44zq_95419345-6f7d-4cb6-b0c6-75bdebf35ade/manager/0.log"
Feb 19 20:12:35 crc kubenswrapper[4749]: I0219 20:12:35.071165 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-66xhq_184b233b-5456-42e8-a09b-61221754095e/manager/0.log"
Feb 19 20:12:35 crc kubenswrapper[4749]: I0219 20:12:35.400545 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-c5zvr_0c080714-223f-4954-81ad-0fbf2d7ceff1/manager/0.log"
Feb 19 20:12:35 crc kubenswrapper[4749]: I0219 20:12:35.568121 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-7c76f_bda10183-e834-4f98-a0cf-47ce14f1d333/manager/0.log"
Feb 19 20:12:35 crc kubenswrapper[4749]: I0219 20:12:35.777748 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-g4tkq_eaf1fbac-75fa-4442-811d-8f51e3a1e66b/manager/0.log"
Feb 19 20:12:36 crc kubenswrapper[4749]: I0219 20:12:36.039844 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-5rm6j_0e32dc41-84cc-42d1-bbf8-be4aa4d4b010/manager/0.log"
Feb 19 20:12:36 crc kubenswrapper[4749]: I0219 20:12:36.212939 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-nlt7x_7938ade1-7dc4-4927-b620-4cdcb7125a94/manager/0.log"
Feb 19 20:12:36 crc kubenswrapper[4749]: I0219 20:12:36.336230 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-k6k2h_aa7dbc9c-deb4-49ae-a0c7-2130343cae10/manager/0.log"
Feb 19 20:12:36 crc kubenswrapper[4749]: I0219 20:12:36.512599 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-x87ll_655da61c-572a-43c2-8d53-c3a3e0f95d43/manager/0.log"
Feb 19 20:12:36 crc kubenswrapper[4749]: I0219 20:12:36.787271 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cft725_b12d13ea-738f-4ecb-af86-4fe5ccfc5b0d/manager/0.log"
Feb 19 20:12:37 crc kubenswrapper[4749]: I0219 20:12:37.142113 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-857dd64d7c-c9mq9_83f4c4a7-f126-44f4-9780-82e159ec9ec7/operator/0.log"
Feb 19 20:12:37 crc kubenswrapper[4749]: I0219 20:12:37.448234 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-n89m9_c177988f-7956-4b60-aaf0-0ece549e28cb/registry-server/0.log"
Feb 19 20:12:37 crc kubenswrapper[4749]: I0219 20:12:37.725178 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-972mw_61c9ed9f-0dff-4560-a6d3-a621e1a6ff09/manager/0.log"
Feb 19 20:12:37 crc kubenswrapper[4749]: I0219 20:12:37.968152 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-2w6pw_f7febf8e-9f11-446d-b8f1-4bc8e0e2dce3/manager/0.log"
Feb 19 20:12:38 crc kubenswrapper[4749]: I0219 20:12:38.165787 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-rtd4f_4a2510e7-7b2d-445a-b092-74831cb6701e/operator/0.log"
Feb 19 20:12:38 crc kubenswrapper[4749]: I0219 20:12:38.361375 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-mcgwg_c8870691-c7b5-4715-8db1-2ac0f6c56ad9/manager/0.log"
Feb 19 20:12:38 crc kubenswrapper[4749]: I0219 20:12:38.831546 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-7cjck_a8475ec0-8fed-454b-9d2e-7008db016ae4/manager/0.log"
Feb 19 20:12:38 crc kubenswrapper[4749]: I0219 20:12:38.907808 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-xx724_e6eac27e-c253-4729-9171-7adca82bbf48/manager/0.log"
Feb 19 20:12:39 crc kubenswrapper[4749]: I0219 20:12:39.234392 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-56fd5cc5c9-s7k5m_aaf03c23-f79b-4c42-9350-dd35ace208e3/manager/0.log"
Feb 19 20:12:39 crc kubenswrapper[4749]: I0219 20:12:39.444877 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c59d96f56-stlgf_4a0f1a4e-6bc1-49bb-a62b-9f51ca92b382/manager/0.log"
Feb 19 20:12:39 crc kubenswrapper[4749]: I0219 20:12:39.570121 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-875j6_b31580cf-6da4-442c-aa01-bed52414bf52/manager/0.log"
Feb 19 20:12:45 crc kubenswrapper[4749]: I0219 20:12:45.581049 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-54ths_ed276a06-3dcf-475c-8d9c-1ee1c364f783/manager/0.log"
Feb 19 20:12:54 crc kubenswrapper[4749]: I0219 20:12:54.725579 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:12:54 crc kubenswrapper[4749]: I0219 20:12:54.726219 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:13:00 crc kubenswrapper[4749]: I0219 20:13:00.041893 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nl4bn_18ce2742-770a-492b-a2c1-b1c615b27c71/control-plane-machine-set-operator/0.log"
Feb 19 20:13:00 crc kubenswrapper[4749]: I0219 20:13:00.266263 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qbnpr_fc385e2a-5c57-49bc-a308-57a35663a452/kube-rbac-proxy/0.log"
Feb 19 20:13:00 crc kubenswrapper[4749]: I0219 20:13:00.281635 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qbnpr_fc385e2a-5c57-49bc-a308-57a35663a452/machine-api-operator/0.log"
Feb 19 20:13:12 crc kubenswrapper[4749]: I0219 20:13:12.269469 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7ngvd_30e026fc-9274-4942-bf3d-68740957aeec/cert-manager-controller/0.log"
Feb 19 20:13:12 crc kubenswrapper[4749]: I0219 20:13:12.448687 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-pbj6v_b4703030-f4cb-4751-a9e1-5a6c1c9f4332/cert-manager-cainjector/0.log"
Feb 19 20:13:12 crc kubenswrapper[4749]: I0219 20:13:12.511440 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-gqvc7_fc78af5c-d237-4523-8035-d8992d4b539c/cert-manager-webhook/0.log"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.077076 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-b57rs_84b14d9d-fa97-4391-9bd8-f3c5680ad7d1/nmstate-console-plugin/0.log"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.245160 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vg82k_6cacdce3-57f6-4ae5-bcdd-6d94b938a155/nmstate-handler/0.log"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.317652 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kc58p_2f498838-9a5f-4320-9044-3602de46b7cb/kube-rbac-proxy/0.log"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.430748 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kc58p_2f498838-9a5f-4320-9044-3602de46b7cb/nmstate-metrics/0.log"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.489966 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-4thb7_0339e5b6-a614-4424-8375-01b24fd90b54/nmstate-operator/0.log"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.627732 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-9lfmz_9c7d501a-b552-4c50-960c-63ae5826b93a/nmstate-webhook/0.log"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.725172 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.725236 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.725282 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.726086 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 20:13:24 crc kubenswrapper[4749]: I0219 20:13:24.726156 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" gracePeriod=600
Feb 19 20:13:24 crc kubenswrapper[4749]: E0219 20:13:24.855594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:13:25 crc kubenswrapper[4749]: I0219 20:13:25.236914 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" exitCode=0
Feb 19 20:13:25 crc kubenswrapper[4749]: I0219 20:13:25.236956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"}
Feb 19 20:13:25 crc kubenswrapper[4749]: I0219 20:13:25.236988 4749 scope.go:117] "RemoveContainer" containerID="f0ed5f6d4b268527a299c8e916a7c157586f7cffd5eaed0132646118a1ff546c"
Feb 19 20:13:25 crc kubenswrapper[4749]: I0219 20:13:25.237786 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:13:25 crc kubenswrapper[4749]: E0219 20:13:25.238105 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:13:36 crc kubenswrapper[4749]: I0219 20:13:36.685710 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:13:36 crc kubenswrapper[4749]: E0219 20:13:36.686519 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:13:37 crc kubenswrapper[4749]: I0219 20:13:37.576017 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hcn88_f02e6039-b225-4177-9704-4cdd8b15f297/prometheus-operator/0.log" Feb 19 20:13:37 crc kubenswrapper[4749]: I0219 20:13:37.735639 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb_6107d2c9-e758-426c-8c42-d8a9241b1ce8/prometheus-operator-admission-webhook/0.log" Feb 19 20:13:37 crc kubenswrapper[4749]: I0219 20:13:37.790527 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8_8b734a5b-56c7-4001-8c16-e4a75f50afb3/prometheus-operator-admission-webhook/0.log" Feb 19 20:13:37 crc kubenswrapper[4749]: I0219 20:13:37.924867 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-74zcg_74ad505e-d15c-43b4-b072-444ffdedf939/operator/0.log" Feb 19 20:13:37 crc kubenswrapper[4749]: I0219 20:13:37.967792 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qs22p_456b4e23-1427-4b46-9672-87cff5dd12b9/perses-operator/0.log" Feb 19 20:13:47 crc kubenswrapper[4749]: I0219 20:13:47.679351 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:13:47 crc kubenswrapper[4749]: E0219 20:13:47.681505 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.247760 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-trs7x_0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc/kube-rbac-proxy/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.360269 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-trs7x_0920f1a5-2f5c-4ec0-b20a-4c1e2a152ccc/controller/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.437473 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.615935 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.620098 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.638446 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.644699 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.814757 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.831194 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.858838 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log" Feb 19 20:13:50 crc kubenswrapper[4749]: I0219 20:13:50.899000 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.049298 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-reloader/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.049893 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-frr-files/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.093902 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/controller/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.095910 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/cp-metrics/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.214822 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/frr-metrics/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.273244 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/kube-rbac-proxy/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.316624 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/kube-rbac-proxy-frr/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.441592 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/reloader/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.563336 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-66v4b_ec4d9a98-6ae9-4282-8b6e-eb107c8fadc5/frr-k8s-webhook-server/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.744180 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74989bddb6-dcst5_bcecd22a-15ba-4bca-8be3-9cc08843c86d/manager/0.log" Feb 19 20:13:51 crc kubenswrapper[4749]: I0219 20:13:51.846973 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66b7b94c9b-n69x7_9c6d5734-2093-42d7-a330-59c6dc0dc138/webhook-server/0.log" Feb 19 20:13:52 crc kubenswrapper[4749]: I0219 20:13:52.010102 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6x54g_d4ac8583-8e6a-4c40-9903-37fe6f82d038/kube-rbac-proxy/0.log" Feb 19 20:13:52 crc kubenswrapper[4749]: I0219 20:13:52.545470 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6x54g_d4ac8583-8e6a-4c40-9903-37fe6f82d038/speaker/0.log" Feb 19 20:13:53 crc kubenswrapper[4749]: I0219 20:13:53.177077 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qx65k_c7166b5c-848f-44ba-a13a-a8c5b6301844/frr/0.log" Feb 19 20:14:01 crc kubenswrapper[4749]: I0219 20:14:01.679436 4749 scope.go:117] 
"RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:14:01 crc kubenswrapper[4749]: E0219 20:14:01.680326 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:14:03 crc kubenswrapper[4749]: I0219 20:14:03.673278 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/util/0.log" Feb 19 20:14:03 crc kubenswrapper[4749]: I0219 20:14:03.849767 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/util/0.log" Feb 19 20:14:03 crc kubenswrapper[4749]: I0219 20:14:03.858697 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/pull/0.log" Feb 19 20:14:03 crc kubenswrapper[4749]: I0219 20:14:03.859863 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/pull/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.013526 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/util/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.014669 4749 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/extract/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.035358 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08twhc5_0278ac54-8aaa-4f8e-ba20-8986a1467d1f/pull/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.179690 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/util/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.331842 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/util/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.345863 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/pull/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.347825 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/pull/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.526951 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/extract/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.527919 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/pull/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.545713 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213g2x2b_8e0e297e-8bb2-4e13-95d4-ad2c7b2a79fe/util/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.697262 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-utilities/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.844878 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-content/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.859221 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-utilities/0.log" Feb 19 20:14:04 crc kubenswrapper[4749]: I0219 20:14:04.867265 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-content/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.024217 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-content/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.057069 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/extract-utilities/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.268511 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-utilities/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.460267 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-content/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.483952 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-utilities/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.519482 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-content/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.710898 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-utilities/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.747594 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/extract-content/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.749716 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chgqt_00a20f87-31e9-444d-a98f-588258a67d7d/registry-server/0.log" Feb 19 20:14:05 crc kubenswrapper[4749]: I0219 20:14:05.968194 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/util/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.049677 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6966bc7795-zbh89" 
podUID="3ebc6a8f-ca72-408a-8add-2a21e7a4c803" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.177313 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/pull/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.212426 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/util/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.283598 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/pull/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.468996 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/pull/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.492981 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/extract/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.512550 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanhczr_9bc71b96-3cbf-4481-a9ac-d77071db9e39/util/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.677006 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g6rth_ffb2ea5d-fbe3-41d3-8dde-e7c7f31722a9/registry-server/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: 
I0219 20:14:06.709247 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wd24f_22a5e8d0-f222-4a7c-8bb7-51689ef460a8/marketplace-operator/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.833496 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-utilities/0.log" Feb 19 20:14:06 crc kubenswrapper[4749]: I0219 20:14:06.988769 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-utilities/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.028309 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-content/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.034834 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-content/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.182972 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-utilities/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.184728 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/extract-content/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.413227 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-utilities/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.431679 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-rt4s8_73ceef31-959a-49ce-9f27-0b41330d330b/registry-server/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.577718 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-utilities/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.589445 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-content/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.612868 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-content/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.755300 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-content/0.log" Feb 19 20:14:07 crc kubenswrapper[4749]: I0219 20:14:07.777619 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/extract-utilities/0.log" Feb 19 20:14:08 crc kubenswrapper[4749]: I0219 20:14:08.477334 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d8w5c_50a852ed-343b-45a6-987c-d6a4a98446dd/registry-server/0.log" Feb 19 20:14:16 crc kubenswrapper[4749]: I0219 20:14:16.684791 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:14:16 crc kubenswrapper[4749]: E0219 20:14:16.686734 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:14:18 crc kubenswrapper[4749]: I0219 20:14:18.999092 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-lhtjb_6107d2c9-e758-426c-8c42-d8a9241b1ce8/prometheus-operator-admission-webhook/0.log" Feb 19 20:14:19 crc kubenswrapper[4749]: I0219 20:14:19.008089 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hcn88_f02e6039-b225-4177-9704-4cdd8b15f297/prometheus-operator/0.log" Feb 19 20:14:19 crc kubenswrapper[4749]: I0219 20:14:19.019190 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58c6cdd4c6-pb2c8_8b734a5b-56c7-4001-8c16-e4a75f50afb3/prometheus-operator-admission-webhook/0.log" Feb 19 20:14:19 crc kubenswrapper[4749]: I0219 20:14:19.200056 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qs22p_456b4e23-1427-4b46-9672-87cff5dd12b9/perses-operator/0.log" Feb 19 20:14:19 crc kubenswrapper[4749]: I0219 20:14:19.222307 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-74zcg_74ad505e-d15c-43b4-b072-444ffdedf939/operator/0.log" Feb 19 20:14:28 crc kubenswrapper[4749]: I0219 20:14:28.680504 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:14:28 crc kubenswrapper[4749]: E0219 20:14:28.681345 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:14:41 crc kubenswrapper[4749]: I0219 20:14:41.679639 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:14:41 crc kubenswrapper[4749]: E0219 20:14:41.680500 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:14:56 crc kubenswrapper[4749]: I0219 20:14:56.695414 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:14:56 crc kubenswrapper[4749]: E0219 20:14:56.696454 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.141053 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj"] Feb 19 20:15:00 crc kubenswrapper[4749]: E0219 20:15:00.142049 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a" containerName="container-00" Feb 19 20:15:00 crc kubenswrapper[4749]: 
I0219 20:15:00.142062 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a" containerName="container-00" Feb 19 20:15:00 crc kubenswrapper[4749]: E0219 20:15:00.142075 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="extract-utilities" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.142081 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="extract-utilities" Feb 19 20:15:00 crc kubenswrapper[4749]: E0219 20:15:00.142100 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="extract-content" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.142107 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="extract-content" Feb 19 20:15:00 crc kubenswrapper[4749]: E0219 20:15:00.142123 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.142129 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.142314 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b999dc79-b1cf-4ab3-99c2-19092badce3c" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.142337 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c58b35d-f43e-4fd0-9e6f-3695d4e4c24a" containerName="container-00" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.142989 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.145176 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.146660 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.152175 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj"] Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.251354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7fbf76-3c5a-43bf-9430-7de30986ca2a-config-volume\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.251505 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7fbf76-3c5a-43bf-9430-7de30986ca2a-secret-volume\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.251528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rts9w\" (UniqueName: \"kubernetes.io/projected/af7fbf76-3c5a-43bf-9430-7de30986ca2a-kube-api-access-rts9w\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.353994 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7fbf76-3c5a-43bf-9430-7de30986ca2a-secret-volume\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.354125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rts9w\" (UniqueName: \"kubernetes.io/projected/af7fbf76-3c5a-43bf-9430-7de30986ca2a-kube-api-access-rts9w\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.354306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7fbf76-3c5a-43bf-9430-7de30986ca2a-config-volume\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.355798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7fbf76-3c5a-43bf-9430-7de30986ca2a-config-volume\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.367717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/af7fbf76-3c5a-43bf-9430-7de30986ca2a-secret-volume\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.370941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rts9w\" (UniqueName: \"kubernetes.io/projected/af7fbf76-3c5a-43bf-9430-7de30986ca2a-kube-api-access-rts9w\") pod \"collect-profiles-29525535-nv9lj\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.473699 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:00 crc kubenswrapper[4749]: I0219 20:15:00.943113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj"] Feb 19 20:15:01 crc kubenswrapper[4749]: I0219 20:15:01.093301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" event={"ID":"af7fbf76-3c5a-43bf-9430-7de30986ca2a","Type":"ContainerStarted","Data":"dcc173b59939bdb525b7d5496d0d4b3463f115c0be94ac80960f7f7a3c9afae9"} Feb 19 20:15:01 crc kubenswrapper[4749]: I0219 20:15:01.093627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" event={"ID":"af7fbf76-3c5a-43bf-9430-7de30986ca2a","Type":"ContainerStarted","Data":"a051021a11df1e31be3bf40b9d1a01d1f337f43b6f331cec9a483ffcf17ca465"} Feb 19 20:15:01 crc kubenswrapper[4749]: I0219 20:15:01.112569 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" 
podStartSLOduration=1.112545624 podStartE2EDuration="1.112545624s" podCreationTimestamp="2026-02-19 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:15:01.109219204 +0000 UTC m=+6075.070439178" watchObservedRunningTime="2026-02-19 20:15:01.112545624 +0000 UTC m=+6075.073765578" Feb 19 20:15:02 crc kubenswrapper[4749]: I0219 20:15:02.104222 4749 generic.go:334] "Generic (PLEG): container finished" podID="af7fbf76-3c5a-43bf-9430-7de30986ca2a" containerID="dcc173b59939bdb525b7d5496d0d4b3463f115c0be94ac80960f7f7a3c9afae9" exitCode=0 Feb 19 20:15:02 crc kubenswrapper[4749]: I0219 20:15:02.104295 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" event={"ID":"af7fbf76-3c5a-43bf-9430-7de30986ca2a","Type":"ContainerDied","Data":"dcc173b59939bdb525b7d5496d0d4b3463f115c0be94ac80960f7f7a3c9afae9"} Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.511011 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.651238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7fbf76-3c5a-43bf-9430-7de30986ca2a-secret-volume\") pod \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.651394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rts9w\" (UniqueName: \"kubernetes.io/projected/af7fbf76-3c5a-43bf-9430-7de30986ca2a-kube-api-access-rts9w\") pod \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.651607 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7fbf76-3c5a-43bf-9430-7de30986ca2a-config-volume\") pod \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\" (UID: \"af7fbf76-3c5a-43bf-9430-7de30986ca2a\") " Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.653153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af7fbf76-3c5a-43bf-9430-7de30986ca2a-config-volume" (OuterVolumeSpecName: "config-volume") pod "af7fbf76-3c5a-43bf-9430-7de30986ca2a" (UID: "af7fbf76-3c5a-43bf-9430-7de30986ca2a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.666119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7fbf76-3c5a-43bf-9430-7de30986ca2a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af7fbf76-3c5a-43bf-9430-7de30986ca2a" (UID: "af7fbf76-3c5a-43bf-9430-7de30986ca2a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.680396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7fbf76-3c5a-43bf-9430-7de30986ca2a-kube-api-access-rts9w" (OuterVolumeSpecName: "kube-api-access-rts9w") pod "af7fbf76-3c5a-43bf-9430-7de30986ca2a" (UID: "af7fbf76-3c5a-43bf-9430-7de30986ca2a"). InnerVolumeSpecName "kube-api-access-rts9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.774479 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7fbf76-3c5a-43bf-9430-7de30986ca2a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.774534 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7fbf76-3c5a-43bf-9430-7de30986ca2a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4749]: I0219 20:15:03.774546 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rts9w\" (UniqueName: \"kubernetes.io/projected/af7fbf76-3c5a-43bf-9430-7de30986ca2a-kube-api-access-rts9w\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:04 crc kubenswrapper[4749]: I0219 20:15:04.125235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" event={"ID":"af7fbf76-3c5a-43bf-9430-7de30986ca2a","Type":"ContainerDied","Data":"a051021a11df1e31be3bf40b9d1a01d1f337f43b6f331cec9a483ffcf17ca465"} Feb 19 20:15:04 crc kubenswrapper[4749]: I0219 20:15:04.125275 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a051021a11df1e31be3bf40b9d1a01d1f337f43b6f331cec9a483ffcf17ca465" Feb 19 20:15:04 crc kubenswrapper[4749]: I0219 20:15:04.125320 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-nv9lj" Feb 19 20:15:04 crc kubenswrapper[4749]: I0219 20:15:04.590387 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"] Feb 19 20:15:04 crc kubenswrapper[4749]: I0219 20:15:04.602105 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-qg5vd"] Feb 19 20:15:04 crc kubenswrapper[4749]: I0219 20:15:04.695017 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf14e5ea-0731-41b0-93d5-0fb06e2190c6" path="/var/lib/kubelet/pods/cf14e5ea-0731-41b0-93d5-0fb06e2190c6/volumes" Feb 19 20:15:07 crc kubenswrapper[4749]: I0219 20:15:07.679207 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:15:07 crc kubenswrapper[4749]: E0219 20:15:07.680085 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:15:21 crc kubenswrapper[4749]: I0219 20:15:21.679574 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:15:21 crc kubenswrapper[4749]: E0219 20:15:21.680357 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:15:21 crc kubenswrapper[4749]: I0219 20:15:21.993423 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8l84h"] Feb 19 20:15:21 crc kubenswrapper[4749]: E0219 20:15:21.993855 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af7fbf76-3c5a-43bf-9430-7de30986ca2a" containerName="collect-profiles" Feb 19 20:15:21 crc kubenswrapper[4749]: I0219 20:15:21.993871 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7fbf76-3c5a-43bf-9430-7de30986ca2a" containerName="collect-profiles" Feb 19 20:15:21 crc kubenswrapper[4749]: I0219 20:15:21.994117 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="af7fbf76-3c5a-43bf-9430-7de30986ca2a" containerName="collect-profiles" Feb 19 20:15:21 crc kubenswrapper[4749]: I0219 20:15:21.995555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.010486 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8l84h"] Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.154947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-utilities\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.155016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-catalog-content\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") 
" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.155549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2hm\" (UniqueName: \"kubernetes.io/projected/c7647095-8e3e-4452-9b11-32122d25f0cb-kube-api-access-ww2hm\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.257242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-utilities\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.257313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-catalog-content\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.257432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2hm\" (UniqueName: \"kubernetes.io/projected/c7647095-8e3e-4452-9b11-32122d25f0cb-kube-api-access-ww2hm\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.258231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-utilities\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " 
pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.258504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-catalog-content\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.278352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2hm\" (UniqueName: \"kubernetes.io/projected/c7647095-8e3e-4452-9b11-32122d25f0cb-kube-api-access-ww2hm\") pod \"redhat-operators-8l84h\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.325778 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:22 crc kubenswrapper[4749]: I0219 20:15:22.976575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8l84h"] Feb 19 20:15:23 crc kubenswrapper[4749]: I0219 20:15:23.304646 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerID="eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d" exitCode=0 Feb 19 20:15:23 crc kubenswrapper[4749]: I0219 20:15:23.304959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l84h" event={"ID":"c7647095-8e3e-4452-9b11-32122d25f0cb","Type":"ContainerDied","Data":"eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d"} Feb 19 20:15:23 crc kubenswrapper[4749]: I0219 20:15:23.304982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l84h" 
event={"ID":"c7647095-8e3e-4452-9b11-32122d25f0cb","Type":"ContainerStarted","Data":"a26cf8f7b4fc6c56889fa205a8e045da6663205730f08b66810c4f3e68c937b1"} Feb 19 20:15:24 crc kubenswrapper[4749]: I0219 20:15:24.315765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l84h" event={"ID":"c7647095-8e3e-4452-9b11-32122d25f0cb","Type":"ContainerStarted","Data":"b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e"} Feb 19 20:15:27 crc kubenswrapper[4749]: I0219 20:15:27.046468 4749 scope.go:117] "RemoveContainer" containerID="5f00b43e59cded978f2de7bc6d0610106993c1e10db4568182161ddf66b16ab9" Feb 19 20:15:29 crc kubenswrapper[4749]: I0219 20:15:29.366885 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerID="b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e" exitCode=0 Feb 19 20:15:29 crc kubenswrapper[4749]: I0219 20:15:29.366969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l84h" event={"ID":"c7647095-8e3e-4452-9b11-32122d25f0cb","Type":"ContainerDied","Data":"b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e"} Feb 19 20:15:30 crc kubenswrapper[4749]: I0219 20:15:30.379309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l84h" event={"ID":"c7647095-8e3e-4452-9b11-32122d25f0cb","Type":"ContainerStarted","Data":"dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a"} Feb 19 20:15:30 crc kubenswrapper[4749]: I0219 20:15:30.404682 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8l84h" podStartSLOduration=2.943460907 podStartE2EDuration="9.404666362s" podCreationTimestamp="2026-02-19 20:15:21 +0000 UTC" firstStartedPulling="2026-02-19 20:15:23.306405591 +0000 UTC m=+6097.267625545" lastFinishedPulling="2026-02-19 20:15:29.767611046 +0000 UTC 
m=+6103.728831000" observedRunningTime="2026-02-19 20:15:30.404010615 +0000 UTC m=+6104.365230579" watchObservedRunningTime="2026-02-19 20:15:30.404666362 +0000 UTC m=+6104.365886316" Feb 19 20:15:32 crc kubenswrapper[4749]: I0219 20:15:32.326521 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:32 crc kubenswrapper[4749]: I0219 20:15:32.326905 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:32 crc kubenswrapper[4749]: I0219 20:15:32.680122 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:15:32 crc kubenswrapper[4749]: E0219 20:15:32.680614 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:15:33 crc kubenswrapper[4749]: I0219 20:15:33.373670 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8l84h" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="registry-server" probeResult="failure" output=< Feb 19 20:15:33 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 19 20:15:33 crc kubenswrapper[4749]: > Feb 19 20:15:43 crc kubenswrapper[4749]: I0219 20:15:43.387358 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8l84h" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="registry-server" probeResult="failure" output=< Feb 19 20:15:43 crc kubenswrapper[4749]: timeout: failed to connect service 
":50051" within 1s Feb 19 20:15:43 crc kubenswrapper[4749]: > Feb 19 20:15:44 crc kubenswrapper[4749]: I0219 20:15:44.679187 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:15:44 crc kubenswrapper[4749]: E0219 20:15:44.679875 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:15:52 crc kubenswrapper[4749]: I0219 20:15:52.374970 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:52 crc kubenswrapper[4749]: I0219 20:15:52.427745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:53 crc kubenswrapper[4749]: I0219 20:15:53.198237 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8l84h"] Feb 19 20:15:53 crc kubenswrapper[4749]: I0219 20:15:53.605513 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8l84h" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="registry-server" containerID="cri-o://dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a" gracePeriod=2 Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.175678 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.199564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-utilities\") pod \"c7647095-8e3e-4452-9b11-32122d25f0cb\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.199868 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-catalog-content\") pod \"c7647095-8e3e-4452-9b11-32122d25f0cb\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.199915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww2hm\" (UniqueName: \"kubernetes.io/projected/c7647095-8e3e-4452-9b11-32122d25f0cb-kube-api-access-ww2hm\") pod \"c7647095-8e3e-4452-9b11-32122d25f0cb\" (UID: \"c7647095-8e3e-4452-9b11-32122d25f0cb\") " Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.203487 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-utilities" (OuterVolumeSpecName: "utilities") pod "c7647095-8e3e-4452-9b11-32122d25f0cb" (UID: "c7647095-8e3e-4452-9b11-32122d25f0cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.209424 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7647095-8e3e-4452-9b11-32122d25f0cb-kube-api-access-ww2hm" (OuterVolumeSpecName: "kube-api-access-ww2hm") pod "c7647095-8e3e-4452-9b11-32122d25f0cb" (UID: "c7647095-8e3e-4452-9b11-32122d25f0cb"). InnerVolumeSpecName "kube-api-access-ww2hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.303778 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.303827 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww2hm\" (UniqueName: \"kubernetes.io/projected/c7647095-8e3e-4452-9b11-32122d25f0cb-kube-api-access-ww2hm\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.365459 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7647095-8e3e-4452-9b11-32122d25f0cb" (UID: "c7647095-8e3e-4452-9b11-32122d25f0cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.406248 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7647095-8e3e-4452-9b11-32122d25f0cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.616800 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerID="dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a" exitCode=0 Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.616857 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8l84h" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.616867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l84h" event={"ID":"c7647095-8e3e-4452-9b11-32122d25f0cb","Type":"ContainerDied","Data":"dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a"} Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.616935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l84h" event={"ID":"c7647095-8e3e-4452-9b11-32122d25f0cb","Type":"ContainerDied","Data":"a26cf8f7b4fc6c56889fa205a8e045da6663205730f08b66810c4f3e68c937b1"} Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.616958 4749 scope.go:117] "RemoveContainer" containerID="dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.637515 4749 scope.go:117] "RemoveContainer" containerID="b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.651017 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8l84h"] Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.658734 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8l84h"] Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.670426 4749 scope.go:117] "RemoveContainer" containerID="eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.695746 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" path="/var/lib/kubelet/pods/c7647095-8e3e-4452-9b11-32122d25f0cb/volumes" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.719881 4749 scope.go:117] "RemoveContainer" 
containerID="dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a" Feb 19 20:15:54 crc kubenswrapper[4749]: E0219 20:15:54.720352 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a\": container with ID starting with dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a not found: ID does not exist" containerID="dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.720391 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a"} err="failed to get container status \"dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a\": rpc error: code = NotFound desc = could not find container \"dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a\": container with ID starting with dc92ae137afe0150c1db341ffc2feaad2de9015d166a4146cb79e8b5f87b754a not found: ID does not exist" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.720413 4749 scope.go:117] "RemoveContainer" containerID="b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e" Feb 19 20:15:54 crc kubenswrapper[4749]: E0219 20:15:54.722059 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e\": container with ID starting with b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e not found: ID does not exist" containerID="b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.722091 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e"} err="failed to get container status \"b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e\": rpc error: code = NotFound desc = could not find container \"b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e\": container with ID starting with b3414efcba0d83628284009b7bf3303f4b67b566d463e1d4c9ae959014a6a91e not found: ID does not exist" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.722111 4749 scope.go:117] "RemoveContainer" containerID="eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d" Feb 19 20:15:54 crc kubenswrapper[4749]: E0219 20:15:54.722383 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d\": container with ID starting with eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d not found: ID does not exist" containerID="eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d" Feb 19 20:15:54 crc kubenswrapper[4749]: I0219 20:15:54.722407 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d"} err="failed to get container status \"eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d\": rpc error: code = NotFound desc = could not find container \"eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d\": container with ID starting with eac373899152f23750e7cc609e1909d14f8e1c226afa13ce4b807a8351115a4d not found: ID does not exist" Feb 19 20:15:55 crc kubenswrapper[4749]: I0219 20:15:55.679468 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:15:55 crc kubenswrapper[4749]: E0219 20:15:55.680303 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:16:07 crc kubenswrapper[4749]: I0219 20:16:07.678782 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:16:07 crc kubenswrapper[4749]: E0219 20:16:07.679778 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.391821 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r7tx2"] Feb 19 20:16:11 crc kubenswrapper[4749]: E0219 20:16:11.392708 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="extract-content" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.392726 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="extract-content" Feb 19 20:16:11 crc kubenswrapper[4749]: E0219 20:16:11.392741 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="registry-server" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.392748 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" 
containerName="registry-server" Feb 19 20:16:11 crc kubenswrapper[4749]: E0219 20:16:11.392769 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="extract-utilities" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.392776 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="extract-utilities" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.392972 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7647095-8e3e-4452-9b11-32122d25f0cb" containerName="registry-server" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.394567 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.430828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7tx2"] Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.542237 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsz5\" (UniqueName: \"kubernetes.io/projected/e2290754-fba3-47c9-8cc7-842688985495-kube-api-access-hdsz5\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.542721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-utilities\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.542905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-catalog-content\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.644871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsz5\" (UniqueName: \"kubernetes.io/projected/e2290754-fba3-47c9-8cc7-842688985495-kube-api-access-hdsz5\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.644920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-utilities\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.645069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-catalog-content\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.645534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-utilities\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.645715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-catalog-content\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.673921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsz5\" (UniqueName: \"kubernetes.io/projected/e2290754-fba3-47c9-8cc7-842688985495-kube-api-access-hdsz5\") pod \"community-operators-r7tx2\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:11 crc kubenswrapper[4749]: I0219 20:16:11.731656 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:12 crc kubenswrapper[4749]: I0219 20:16:12.274654 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7tx2"] Feb 19 20:16:12 crc kubenswrapper[4749]: I0219 20:16:12.811804 4749 generic.go:334] "Generic (PLEG): container finished" podID="e2290754-fba3-47c9-8cc7-842688985495" containerID="0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030" exitCode=0 Feb 19 20:16:12 crc kubenswrapper[4749]: I0219 20:16:12.811855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7tx2" event={"ID":"e2290754-fba3-47c9-8cc7-842688985495","Type":"ContainerDied","Data":"0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030"} Feb 19 20:16:12 crc kubenswrapper[4749]: I0219 20:16:12.813169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7tx2" event={"ID":"e2290754-fba3-47c9-8cc7-842688985495","Type":"ContainerStarted","Data":"23c5eeea52a5fb6b176f796ebfa2f629518bbefe70a40a70d91fee8e3bfdb317"} Feb 19 20:16:12 crc kubenswrapper[4749]: I0219 20:16:12.814983 4749 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.776462 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mbbhl"] Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.779275 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.795543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbbhl"] Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.841649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7tx2" event={"ID":"e2290754-fba3-47c9-8cc7-842688985495","Type":"ContainerStarted","Data":"6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3"} Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.890696 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-utilities\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.890764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bz2f\" (UniqueName: \"kubernetes.io/projected/398f9e0d-2529-45a5-b52a-3b3f60b816f2-kube-api-access-4bz2f\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.890842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-catalog-content\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.994999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-utilities\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.995079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bz2f\" (UniqueName: \"kubernetes.io/projected/398f9e0d-2529-45a5-b52a-3b3f60b816f2-kube-api-access-4bz2f\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.995108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-catalog-content\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.995769 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-catalog-content\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:13 crc kubenswrapper[4749]: I0219 20:16:13.995815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-utilities\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:14 crc kubenswrapper[4749]: I0219 20:16:14.050585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bz2f\" (UniqueName: \"kubernetes.io/projected/398f9e0d-2529-45a5-b52a-3b3f60b816f2-kube-api-access-4bz2f\") pod \"certified-operators-mbbhl\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") " pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:14 crc kubenswrapper[4749]: I0219 20:16:14.099541 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:14 crc kubenswrapper[4749]: I0219 20:16:14.601222 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbbhl"] Feb 19 20:16:14 crc kubenswrapper[4749]: W0219 20:16:14.606499 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398f9e0d_2529_45a5_b52a_3b3f60b816f2.slice/crio-4578c0c84299fe70a4a057d464357bca0502387d201b552dbddc1007edbc2c41 WatchSource:0}: Error finding container 4578c0c84299fe70a4a057d464357bca0502387d201b552dbddc1007edbc2c41: Status 404 returned error can't find the container with id 4578c0c84299fe70a4a057d464357bca0502387d201b552dbddc1007edbc2c41 Feb 19 20:16:14 crc kubenswrapper[4749]: I0219 20:16:14.851429 4749 generic.go:334] "Generic (PLEG): container finished" podID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerID="b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465" exitCode=0 Feb 19 20:16:14 crc kubenswrapper[4749]: I0219 20:16:14.851529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhl" 
event={"ID":"398f9e0d-2529-45a5-b52a-3b3f60b816f2","Type":"ContainerDied","Data":"b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465"} Feb 19 20:16:14 crc kubenswrapper[4749]: I0219 20:16:14.851749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhl" event={"ID":"398f9e0d-2529-45a5-b52a-3b3f60b816f2","Type":"ContainerStarted","Data":"4578c0c84299fe70a4a057d464357bca0502387d201b552dbddc1007edbc2c41"} Feb 19 20:16:15 crc kubenswrapper[4749]: I0219 20:16:15.862816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhl" event={"ID":"398f9e0d-2529-45a5-b52a-3b3f60b816f2","Type":"ContainerStarted","Data":"1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda"} Feb 19 20:16:15 crc kubenswrapper[4749]: I0219 20:16:15.864709 4749 generic.go:334] "Generic (PLEG): container finished" podID="e2290754-fba3-47c9-8cc7-842688985495" containerID="6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3" exitCode=0 Feb 19 20:16:15 crc kubenswrapper[4749]: I0219 20:16:15.864740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7tx2" event={"ID":"e2290754-fba3-47c9-8cc7-842688985495","Type":"ContainerDied","Data":"6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3"} Feb 19 20:16:16 crc kubenswrapper[4749]: I0219 20:16:16.898457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7tx2" event={"ID":"e2290754-fba3-47c9-8cc7-842688985495","Type":"ContainerStarted","Data":"10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417"} Feb 19 20:16:16 crc kubenswrapper[4749]: I0219 20:16:16.926130 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r7tx2" podStartSLOduration=2.406285 podStartE2EDuration="5.926112088s" podCreationTimestamp="2026-02-19 20:16:11 +0000 
UTC" firstStartedPulling="2026-02-19 20:16:12.814681872 +0000 UTC m=+6146.775901826" lastFinishedPulling="2026-02-19 20:16:16.33450896 +0000 UTC m=+6150.295728914" observedRunningTime="2026-02-19 20:16:16.920687356 +0000 UTC m=+6150.881907310" watchObservedRunningTime="2026-02-19 20:16:16.926112088 +0000 UTC m=+6150.887332042" Feb 19 20:16:17 crc kubenswrapper[4749]: I0219 20:16:17.923079 4749 generic.go:334] "Generic (PLEG): container finished" podID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerID="1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda" exitCode=0 Feb 19 20:16:17 crc kubenswrapper[4749]: I0219 20:16:17.923122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhl" event={"ID":"398f9e0d-2529-45a5-b52a-3b3f60b816f2","Type":"ContainerDied","Data":"1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda"} Feb 19 20:16:19 crc kubenswrapper[4749]: I0219 20:16:19.942929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhl" event={"ID":"398f9e0d-2529-45a5-b52a-3b3f60b816f2","Type":"ContainerStarted","Data":"54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78"} Feb 19 20:16:19 crc kubenswrapper[4749]: I0219 20:16:19.965631 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mbbhl" podStartSLOduration=3.057835948 podStartE2EDuration="6.965609645s" podCreationTimestamp="2026-02-19 20:16:13 +0000 UTC" firstStartedPulling="2026-02-19 20:16:14.853883341 +0000 UTC m=+6148.815103285" lastFinishedPulling="2026-02-19 20:16:18.761657038 +0000 UTC m=+6152.722876982" observedRunningTime="2026-02-19 20:16:19.961152518 +0000 UTC m=+6153.922372472" watchObservedRunningTime="2026-02-19 20:16:19.965609645 +0000 UTC m=+6153.926829599" Feb 19 20:16:20 crc kubenswrapper[4749]: I0219 20:16:20.682999 4749 scope.go:117] "RemoveContainer" 
containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:16:20 crc kubenswrapper[4749]: E0219 20:16:20.683617 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" Feb 19 20:16:21 crc kubenswrapper[4749]: I0219 20:16:21.732764 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:21 crc kubenswrapper[4749]: I0219 20:16:21.733637 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:21 crc kubenswrapper[4749]: I0219 20:16:21.783315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:22 crc kubenswrapper[4749]: I0219 20:16:22.007772 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:24 crc kubenswrapper[4749]: I0219 20:16:24.100181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:24 crc kubenswrapper[4749]: I0219 20:16:24.100305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:24 crc kubenswrapper[4749]: I0219 20:16:24.184830 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7tx2"] Feb 19 20:16:24 crc kubenswrapper[4749]: I0219 20:16:24.209239 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:24 crc kubenswrapper[4749]: I0219 20:16:24.998591 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r7tx2" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="registry-server" containerID="cri-o://10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417" gracePeriod=2 Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.062685 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mbbhl" Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.475105 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.628332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-utilities\") pod \"e2290754-fba3-47c9-8cc7-842688985495\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.628814 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdsz5\" (UniqueName: \"kubernetes.io/projected/e2290754-fba3-47c9-8cc7-842688985495-kube-api-access-hdsz5\") pod \"e2290754-fba3-47c9-8cc7-842688985495\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.628894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-catalog-content\") pod \"e2290754-fba3-47c9-8cc7-842688985495\" (UID: \"e2290754-fba3-47c9-8cc7-842688985495\") " Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.629496 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-utilities" (OuterVolumeSpecName: "utilities") pod "e2290754-fba3-47c9-8cc7-842688985495" (UID: "e2290754-fba3-47c9-8cc7-842688985495"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.635327 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2290754-fba3-47c9-8cc7-842688985495-kube-api-access-hdsz5" (OuterVolumeSpecName: "kube-api-access-hdsz5") pod "e2290754-fba3-47c9-8cc7-842688985495" (UID: "e2290754-fba3-47c9-8cc7-842688985495"). InnerVolumeSpecName "kube-api-access-hdsz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.690424 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2290754-fba3-47c9-8cc7-842688985495" (UID: "e2290754-fba3-47c9-8cc7-842688985495"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.732061 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.732093 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdsz5\" (UniqueName: \"kubernetes.io/projected/e2290754-fba3-47c9-8cc7-842688985495-kube-api-access-hdsz5\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:25 crc kubenswrapper[4749]: I0219 20:16:25.732103 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2290754-fba3-47c9-8cc7-842688985495-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.011793 4749 generic.go:334] "Generic (PLEG): container finished" podID="e2290754-fba3-47c9-8cc7-842688985495" containerID="10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417" exitCode=0 Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.012830 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7tx2" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.013119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7tx2" event={"ID":"e2290754-fba3-47c9-8cc7-842688985495","Type":"ContainerDied","Data":"10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417"} Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.013156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7tx2" event={"ID":"e2290754-fba3-47c9-8cc7-842688985495","Type":"ContainerDied","Data":"23c5eeea52a5fb6b176f796ebfa2f629518bbefe70a40a70d91fee8e3bfdb317"} Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.013177 4749 scope.go:117] "RemoveContainer" containerID="10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.052922 4749 scope.go:117] "RemoveContainer" containerID="6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.068093 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7tx2"] Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.077698 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r7tx2"] Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.087438 4749 scope.go:117] "RemoveContainer" containerID="0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.131218 4749 scope.go:117] "RemoveContainer" containerID="10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417" Feb 19 20:16:26 crc kubenswrapper[4749]: E0219 20:16:26.132089 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417\": container with ID starting with 10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417 not found: ID does not exist" containerID="10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.132150 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417"} err="failed to get container status \"10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417\": rpc error: code = NotFound desc = could not find container \"10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417\": container with ID starting with 10e672cf5df596fa7836ec63393fc73fb54929eb5427c90b32694e199d386417 not found: ID does not exist" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.132176 4749 scope.go:117] "RemoveContainer" containerID="6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3" Feb 19 20:16:26 crc kubenswrapper[4749]: E0219 20:16:26.132617 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3\": container with ID starting with 6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3 not found: ID does not exist" containerID="6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.132665 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3"} err="failed to get container status \"6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3\": rpc error: code = NotFound desc = could not find container \"6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3\": container with ID 
starting with 6057417face57fb1609395b4382dd54b0eb6bdb85f0022ce75880332d8e00bd3 not found: ID does not exist" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.132698 4749 scope.go:117] "RemoveContainer" containerID="0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030" Feb 19 20:16:26 crc kubenswrapper[4749]: E0219 20:16:26.133103 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030\": container with ID starting with 0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030 not found: ID does not exist" containerID="0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.133137 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030"} err="failed to get container status \"0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030\": rpc error: code = NotFound desc = could not find container \"0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030\": container with ID starting with 0eba7641b490a2c1a3b285ede5499a53b02b3386404feafd0067e52a661e4030 not found: ID does not exist" Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.577144 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbbhl"] Feb 19 20:16:26 crc kubenswrapper[4749]: I0219 20:16:26.692992 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2290754-fba3-47c9-8cc7-842688985495" path="/var/lib/kubelet/pods/e2290754-fba3-47c9-8cc7-842688985495/volumes" Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.029518 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mbbhl" 
podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="registry-server" containerID="cri-o://54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78" gracePeriod=2
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.632820 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhl"
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.727101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bz2f\" (UniqueName: \"kubernetes.io/projected/398f9e0d-2529-45a5-b52a-3b3f60b816f2-kube-api-access-4bz2f\") pod \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") "
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.727282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-catalog-content\") pod \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") "
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.727341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-utilities\") pod \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\" (UID: \"398f9e0d-2529-45a5-b52a-3b3f60b816f2\") "
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.728755 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-utilities" (OuterVolumeSpecName: "utilities") pod "398f9e0d-2529-45a5-b52a-3b3f60b816f2" (UID: "398f9e0d-2529-45a5-b52a-3b3f60b816f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.733886 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398f9e0d-2529-45a5-b52a-3b3f60b816f2-kube-api-access-4bz2f" (OuterVolumeSpecName: "kube-api-access-4bz2f") pod "398f9e0d-2529-45a5-b52a-3b3f60b816f2" (UID: "398f9e0d-2529-45a5-b52a-3b3f60b816f2"). InnerVolumeSpecName "kube-api-access-4bz2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.779044 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "398f9e0d-2529-45a5-b52a-3b3f60b816f2" (UID: "398f9e0d-2529-45a5-b52a-3b3f60b816f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.829879 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.829919 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398f9e0d-2529-45a5-b52a-3b3f60b816f2-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 20:16:28 crc kubenswrapper[4749]: I0219 20:16:28.829935 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bz2f\" (UniqueName: \"kubernetes.io/projected/398f9e0d-2529-45a5-b52a-3b3f60b816f2-kube-api-access-4bz2f\") on node \"crc\" DevicePath \"\""
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.040061 4749 generic.go:334] "Generic (PLEG): container finished" podID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerID="54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78" exitCode=0
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.040193 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhl"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.040109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhl" event={"ID":"398f9e0d-2529-45a5-b52a-3b3f60b816f2","Type":"ContainerDied","Data":"54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78"}
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.041119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhl" event={"ID":"398f9e0d-2529-45a5-b52a-3b3f60b816f2","Type":"ContainerDied","Data":"4578c0c84299fe70a4a057d464357bca0502387d201b552dbddc1007edbc2c41"}
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.041142 4749 scope.go:117] "RemoveContainer" containerID="54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.072230 4749 scope.go:117] "RemoveContainer" containerID="1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.093716 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbbhl"]
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.100225 4749 scope.go:117] "RemoveContainer" containerID="b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.104595 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mbbhl"]
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.167966 4749 scope.go:117] "RemoveContainer" containerID="54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78"
Feb 19 20:16:29 crc kubenswrapper[4749]: E0219 20:16:29.168747 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78\": container with ID starting with 54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78 not found: ID does not exist" containerID="54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.168783 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78"} err="failed to get container status \"54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78\": rpc error: code = NotFound desc = could not find container \"54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78\": container with ID starting with 54372cb9482b293c1d7625e1c1b6ff5e642ab75e8ce86a41285888bd5c82fa78 not found: ID does not exist"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.168809 4749 scope.go:117] "RemoveContainer" containerID="1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda"
Feb 19 20:16:29 crc kubenswrapper[4749]: E0219 20:16:29.170602 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda\": container with ID starting with 1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda not found: ID does not exist" containerID="1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.170628 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda"} err="failed to get container status \"1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda\": rpc error: code = NotFound desc = could not find container \"1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda\": container with ID starting with 1eca0b237fdf3181558bc1b2e148d7f2f4de79b28fd04abc9f5ba0c9fb59fbda not found: ID does not exist"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.170644 4749 scope.go:117] "RemoveContainer" containerID="b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465"
Feb 19 20:16:29 crc kubenswrapper[4749]: E0219 20:16:29.170921 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465\": container with ID starting with b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465 not found: ID does not exist" containerID="b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465"
Feb 19 20:16:29 crc kubenswrapper[4749]: I0219 20:16:29.171057 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465"} err="failed to get container status \"b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465\": rpc error: code = NotFound desc = could not find container \"b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465\": container with ID starting with b93141b3494441889a492a1aecce8a37e6baba6891ebe10ffe16c789c2c29465 not found: ID does not exist"
Feb 19 20:16:30 crc kubenswrapper[4749]: I0219 20:16:30.695582 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" path="/var/lib/kubelet/pods/398f9e0d-2529-45a5-b52a-3b3f60b816f2/volumes"
Feb 19 20:16:32 crc kubenswrapper[4749]: I0219 20:16:32.073152 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerID="3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59" exitCode=0
Feb 19 20:16:32 crc kubenswrapper[4749]: I0219 20:16:32.073259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbkkx/must-gather-s5qs9" event={"ID":"8e1e8621-901e-44d8-a2ba-7b56b11d302b","Type":"ContainerDied","Data":"3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59"}
Feb 19 20:16:32 crc kubenswrapper[4749]: I0219 20:16:32.074193 4749 scope.go:117] "RemoveContainer" containerID="3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59"
Feb 19 20:16:33 crc kubenswrapper[4749]: I0219 20:16:33.029535 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qbkkx_must-gather-s5qs9_8e1e8621-901e-44d8-a2ba-7b56b11d302b/gather/0.log"
Feb 19 20:16:34 crc kubenswrapper[4749]: I0219 20:16:34.678749 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:16:34 crc kubenswrapper[4749]: E0219 20:16:34.679230 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:16:45 crc kubenswrapper[4749]: I0219 20:16:45.679899 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:16:45 crc kubenswrapper[4749]: E0219 20:16:45.680730 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:16:46 crc kubenswrapper[4749]: I0219 20:16:46.719508 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qbkkx/must-gather-s5qs9"]
Feb 19 20:16:46 crc kubenswrapper[4749]: I0219 20:16:46.720070 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qbkkx/must-gather-s5qs9" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerName="copy" containerID="cri-o://a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812" gracePeriod=2
Feb 19 20:16:46 crc kubenswrapper[4749]: I0219 20:16:46.752935 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qbkkx/must-gather-s5qs9"]
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.122165 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qbkkx_must-gather-s5qs9_8e1e8621-901e-44d8-a2ba-7b56b11d302b/copy/0.log"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.122754 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.226996 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qbkkx_must-gather-s5qs9_8e1e8621-901e-44d8-a2ba-7b56b11d302b/copy/0.log"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.227766 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerID="a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812" exitCode=143
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.227834 4749 scope.go:117] "RemoveContainer" containerID="a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.227855 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qbkkx/must-gather-s5qs9"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.246481 4749 scope.go:117] "RemoveContainer" containerID="3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.301798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e1e8621-901e-44d8-a2ba-7b56b11d302b-must-gather-output\") pod \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") "
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.302200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bd4g\" (UniqueName: \"kubernetes.io/projected/8e1e8621-901e-44d8-a2ba-7b56b11d302b-kube-api-access-4bd4g\") pod \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\" (UID: \"8e1e8621-901e-44d8-a2ba-7b56b11d302b\") "
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.302878 4749 scope.go:117] "RemoveContainer" containerID="a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812"
Feb 19 20:16:47 crc kubenswrapper[4749]: E0219 20:16:47.303356 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812\": container with ID starting with a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812 not found: ID does not exist" containerID="a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.303400 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812"} err="failed to get container status \"a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812\": rpc error: code = NotFound desc = could not find container \"a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812\": container with ID starting with a7b40b66ddf30100178c4a7ddb28663308449515ba118b756caaf21fea308812 not found: ID does not exist"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.303422 4749 scope.go:117] "RemoveContainer" containerID="3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59"
Feb 19 20:16:47 crc kubenswrapper[4749]: E0219 20:16:47.303832 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59\": container with ID starting with 3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59 not found: ID does not exist" containerID="3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.303853 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59"} err="failed to get container status \"3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59\": rpc error: code = NotFound desc = could not find container \"3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59\": container with ID starting with 3374dad2e38ffa4b1ac48acb3a4d0654ba45ec425de5d23303129c4ecbca7a59 not found: ID does not exist"
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.308189 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1e8621-901e-44d8-a2ba-7b56b11d302b-kube-api-access-4bd4g" (OuterVolumeSpecName: "kube-api-access-4bd4g") pod "8e1e8621-901e-44d8-a2ba-7b56b11d302b" (UID: "8e1e8621-901e-44d8-a2ba-7b56b11d302b"). InnerVolumeSpecName "kube-api-access-4bd4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.404867 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bd4g\" (UniqueName: \"kubernetes.io/projected/8e1e8621-901e-44d8-a2ba-7b56b11d302b-kube-api-access-4bd4g\") on node \"crc\" DevicePath \"\""
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.497218 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1e8621-901e-44d8-a2ba-7b56b11d302b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8e1e8621-901e-44d8-a2ba-7b56b11d302b" (UID: "8e1e8621-901e-44d8-a2ba-7b56b11d302b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 20:16:47 crc kubenswrapper[4749]: I0219 20:16:47.506979 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e1e8621-901e-44d8-a2ba-7b56b11d302b-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 19 20:16:48 crc kubenswrapper[4749]: I0219 20:16:48.689984 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" path="/var/lib/kubelet/pods/8e1e8621-901e-44d8-a2ba-7b56b11d302b/volumes"
Feb 19 20:16:56 crc kubenswrapper[4749]: I0219 20:16:56.699155 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:16:56 crc kubenswrapper[4749]: E0219 20:16:56.700379 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:17:07 crc kubenswrapper[4749]: I0219 20:17:07.679476 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:17:07 crc kubenswrapper[4749]: E0219 20:17:07.680310 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:17:21 crc kubenswrapper[4749]: I0219 20:17:21.678770 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:17:21 crc kubenswrapper[4749]: E0219 20:17:21.679590 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:17:27 crc kubenswrapper[4749]: I0219 20:17:27.147194 4749 scope.go:117] "RemoveContainer" containerID="df8298cb06d59b2583597f23c71fc10924009b208f4624f81a268634f1cc3fa0"
Feb 19 20:17:36 crc kubenswrapper[4749]: I0219 20:17:36.685747 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:17:36 crc kubenswrapper[4749]: E0219 20:17:36.686616 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:17:48 crc kubenswrapper[4749]: I0219 20:17:48.679139 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:17:48 crc kubenswrapper[4749]: E0219 20:17:48.679995 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:18:02 crc kubenswrapper[4749]: I0219 20:18:02.679350 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:18:02 crc kubenswrapper[4749]: E0219 20:18:02.680135 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:18:15 crc kubenswrapper[4749]: I0219 20:18:15.678891 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:18:15 crc kubenswrapper[4749]: E0219 20:18:15.679885 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nzldt_openshift-machine-config-operator(b4b7c32a-5fc5-45f9-848f-f344598f6d73)\"" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73"
Feb 19 20:18:27 crc kubenswrapper[4749]: I0219 20:18:27.679094 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507"
Feb 19 20:18:28 crc kubenswrapper[4749]: I0219 20:18:28.161437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"a47df6624e46b63ab5089bbd3e77b4e6647cd59c40d1ab05a91818f03e381dc5"}
Feb 19 20:20:54 crc kubenswrapper[4749]: I0219 20:20:54.725738 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:20:54 crc kubenswrapper[4749]: I0219 20:20:54.726281 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:21:24 crc kubenswrapper[4749]: I0219 20:21:24.725217 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:21:24 crc kubenswrapper[4749]: I0219 20:21:24.725795 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.480078 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6sft"]
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481403 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="extract-utilities"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481427 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="extract-utilities"
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481457 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="registry-server"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481469 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="registry-server"
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481495 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="extract-utilities"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481508 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="extract-utilities"
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481529 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="registry-server"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481541 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="registry-server"
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481580 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerName="gather"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481596 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerName="gather"
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481621 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerName="copy"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481633 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerName="copy"
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481678 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="extract-content"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481690 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="extract-content"
Feb 19 20:21:28 crc kubenswrapper[4749]: E0219 20:21:28.481710 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="extract-content"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.481722 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="extract-content"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.482127 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerName="gather"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.482172 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2290754-fba3-47c9-8cc7-842688985495" containerName="registry-server"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.482199 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1e8621-901e-44d8-a2ba-7b56b11d302b" containerName="copy"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.482227 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="398f9e0d-2529-45a5-b52a-3b3f60b816f2" containerName="registry-server"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.484725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.488631 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6sft"]
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.616974 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-utilities\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.617205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-catalog-content\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.617375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2f6\" (UniqueName: \"kubernetes.io/projected/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-kube-api-access-7q2f6\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.719698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-catalog-content\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.719805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q2f6\" (UniqueName: \"kubernetes.io/projected/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-kube-api-access-7q2f6\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.719962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-utilities\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.720812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-catalog-content\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.720843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-utilities\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.757181 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q2f6\" (UniqueName: \"kubernetes.io/projected/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-kube-api-access-7q2f6\") pod \"redhat-marketplace-h6sft\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") " pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:28 crc kubenswrapper[4749]: I0219 20:21:28.815645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:29 crc kubenswrapper[4749]: I0219 20:21:29.289685 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6sft"]
Feb 19 20:21:29 crc kubenswrapper[4749]: I0219 20:21:29.787511 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" containerID="ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80" exitCode=0
Feb 19 20:21:29 crc kubenswrapper[4749]: I0219 20:21:29.787577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6sft" event={"ID":"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb","Type":"ContainerDied","Data":"ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80"}
Feb 19 20:21:29 crc kubenswrapper[4749]: I0219 20:21:29.787654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6sft" event={"ID":"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb","Type":"ContainerStarted","Data":"de5d8c32217f44b2352640883c1e8ffe1b71efdb4ee069268a2b91ece4092a20"}
Feb 19 20:21:29 crc kubenswrapper[4749]: I0219 20:21:29.789696 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 20:21:30 crc kubenswrapper[4749]: I0219 20:21:30.798106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6sft" event={"ID":"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb","Type":"ContainerStarted","Data":"c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974"}
Feb 19 20:21:31 crc kubenswrapper[4749]: I0219 20:21:31.809650 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" containerID="c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974" exitCode=0
Feb 19 20:21:31 crc kubenswrapper[4749]: I0219 20:21:31.809774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6sft" event={"ID":"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb","Type":"ContainerDied","Data":"c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974"}
Feb 19 20:21:32 crc kubenswrapper[4749]: I0219 20:21:32.819598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6sft" event={"ID":"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb","Type":"ContainerStarted","Data":"29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d"}
Feb 19 20:21:32 crc kubenswrapper[4749]: I0219 20:21:32.843627 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6sft" podStartSLOduration=2.45699698 podStartE2EDuration="4.843605669s" podCreationTimestamp="2026-02-19 20:21:28 +0000 UTC" firstStartedPulling="2026-02-19 20:21:29.789407787 +0000 UTC m=+6463.750627741" lastFinishedPulling="2026-02-19 20:21:32.176016476 +0000 UTC m=+6466.137236430" observedRunningTime="2026-02-19 20:21:32.834361856 +0000 UTC m=+6466.795581810" watchObservedRunningTime="2026-02-19 20:21:32.843605669 +0000 UTC m=+6466.804825623"
Feb 19 20:21:38 crc kubenswrapper[4749]: I0219 20:21:38.816755 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:38 crc kubenswrapper[4749]: I0219 20:21:38.817378 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:38 crc kubenswrapper[4749]: I0219 20:21:38.866216 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:38 crc kubenswrapper[4749]: I0219 20:21:38.929510 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:39 crc kubenswrapper[4749]: I0219 20:21:39.111077 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6sft"]
Feb 19 20:21:40 crc kubenswrapper[4749]: I0219 20:21:40.891911 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6sft" podUID="4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" containerName="registry-server" containerID="cri-o://29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d" gracePeriod=2
Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.353569 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6sft"
Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.475676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-utilities\") pod \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") "
Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.475942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-catalog-content\") pod \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") "
Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.475974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q2f6\" (UniqueName: \"kubernetes.io/projected/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-kube-api-access-7q2f6\") pod \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\" (UID: \"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb\") "
Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.477089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-utilities" (OuterVolumeSpecName: "utilities") pod
"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" (UID: "4b0b2fd8-432d-4783-a8db-464bf2cd7aeb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.481721 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-kube-api-access-7q2f6" (OuterVolumeSpecName: "kube-api-access-7q2f6") pod "4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" (UID: "4b0b2fd8-432d-4783-a8db-464bf2cd7aeb"). InnerVolumeSpecName "kube-api-access-7q2f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.499756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" (UID: "4b0b2fd8-432d-4783-a8db-464bf2cd7aeb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.578451 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.578483 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.578495 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q2f6\" (UniqueName: \"kubernetes.io/projected/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb-kube-api-access-7q2f6\") on node \"crc\" DevicePath \"\"" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.903672 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" containerID="29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d" exitCode=0 Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.903720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6sft" event={"ID":"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb","Type":"ContainerDied","Data":"29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d"} Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.903744 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6sft" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.903768 4749 scope.go:117] "RemoveContainer" containerID="29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.903752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6sft" event={"ID":"4b0b2fd8-432d-4783-a8db-464bf2cd7aeb","Type":"ContainerDied","Data":"de5d8c32217f44b2352640883c1e8ffe1b71efdb4ee069268a2b91ece4092a20"} Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.928597 4749 scope.go:117] "RemoveContainer" containerID="c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974" Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.945268 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6sft"] Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.957745 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6sft"] Feb 19 20:21:41 crc kubenswrapper[4749]: I0219 20:21:41.967698 4749 scope.go:117] "RemoveContainer" containerID="ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80" Feb 19 20:21:42 crc kubenswrapper[4749]: I0219 20:21:42.002009 4749 scope.go:117] "RemoveContainer" containerID="29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d" Feb 19 20:21:42 crc kubenswrapper[4749]: E0219 20:21:42.002520 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d\": container with ID starting with 29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d not found: ID does not exist" containerID="29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d" Feb 19 20:21:42 crc kubenswrapper[4749]: I0219 20:21:42.002550 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d"} err="failed to get container status \"29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d\": rpc error: code = NotFound desc = could not find container \"29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d\": container with ID starting with 29b33366f7082bc4991f351d0f9bde4345cf55ae659173984dcc11e0d0a3c56d not found: ID does not exist" Feb 19 20:21:42 crc kubenswrapper[4749]: I0219 20:21:42.002579 4749 scope.go:117] "RemoveContainer" containerID="c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974" Feb 19 20:21:42 crc kubenswrapper[4749]: E0219 20:21:42.002872 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974\": container with ID starting with c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974 not found: ID does not exist" containerID="c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974" Feb 19 20:21:42 crc kubenswrapper[4749]: I0219 20:21:42.002921 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974"} err="failed to get container status \"c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974\": rpc error: code = NotFound desc = could not find container \"c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974\": container with ID starting with c97a30d60dd2b6dc0e761016dd74b008ae879483990b2dbef0f175888186d974 not found: ID does not exist" Feb 19 20:21:42 crc kubenswrapper[4749]: I0219 20:21:42.002955 4749 scope.go:117] "RemoveContainer" containerID="ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80" Feb 19 20:21:42 crc kubenswrapper[4749]: E0219 
20:21:42.003395 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80\": container with ID starting with ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80 not found: ID does not exist" containerID="ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80" Feb 19 20:21:42 crc kubenswrapper[4749]: I0219 20:21:42.003425 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80"} err="failed to get container status \"ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80\": rpc error: code = NotFound desc = could not find container \"ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80\": container with ID starting with ab9dd583a310527e3f9da2c12107294db113742820dcf1d4a0c799213f90bb80 not found: ID does not exist" Feb 19 20:21:42 crc kubenswrapper[4749]: I0219 20:21:42.692606 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0b2fd8-432d-4783-a8db-464bf2cd7aeb" path="/var/lib/kubelet/pods/4b0b2fd8-432d-4783-a8db-464bf2cd7aeb/volumes" Feb 19 20:21:54 crc kubenswrapper[4749]: I0219 20:21:54.725801 4749 patch_prober.go:28] interesting pod/machine-config-daemon-nzldt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:21:54 crc kubenswrapper[4749]: I0219 20:21:54.726389 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 20:21:54 crc kubenswrapper[4749]: I0219 20:21:54.726476 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" Feb 19 20:21:54 crc kubenswrapper[4749]: I0219 20:21:54.727581 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a47df6624e46b63ab5089bbd3e77b4e6647cd59c40d1ab05a91818f03e381dc5"} pod="openshift-machine-config-operator/machine-config-daemon-nzldt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 20:21:54 crc kubenswrapper[4749]: I0219 20:21:54.727668 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" podUID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerName="machine-config-daemon" containerID="cri-o://a47df6624e46b63ab5089bbd3e77b4e6647cd59c40d1ab05a91818f03e381dc5" gracePeriod=600 Feb 19 20:21:55 crc kubenswrapper[4749]: I0219 20:21:55.017696 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4b7c32a-5fc5-45f9-848f-f344598f6d73" containerID="a47df6624e46b63ab5089bbd3e77b4e6647cd59c40d1ab05a91818f03e381dc5" exitCode=0 Feb 19 20:21:55 crc kubenswrapper[4749]: I0219 20:21:55.017759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerDied","Data":"a47df6624e46b63ab5089bbd3e77b4e6647cd59c40d1ab05a91818f03e381dc5"} Feb 19 20:21:55 crc kubenswrapper[4749]: I0219 20:21:55.017808 4749 scope.go:117] "RemoveContainer" containerID="a001acf159d3601ec6673a835c2b3c925257dbba55c6a59a9a64c8fcac253507" Feb 19 20:21:56 crc kubenswrapper[4749]: I0219 20:21:56.030888 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nzldt" 
event={"ID":"b4b7c32a-5fc5-45f9-848f-f344598f6d73","Type":"ContainerStarted","Data":"66eda710f4a424c3bfd0112521f0c5b9c1d2324841acd33c01749e98821f3be7"}